Process Assessment Document

Premise

You are a member of an independent quality organization charged with assessing the effectiveness of a software development team. The team has just completed the development of a Java project using a specified Software Process. Prepare a document containing the information described below.

Assume you have access to the following items:

  • The team's deliverable documents and source code
  • The team's defect log
  • The team's time record

For the Android “Team03:TODOER” prototype development project, the team utilized the evolutionary prototyping software development process. Evolutionary prototyping is an iterative process and differs from throwaway prototyping in that the main goal is to build a robust prototype in a structured manner and constantly refine it: the evolutionary prototype, when initially built, forms the heart of the new system, and improvements and further requirements are built on top of it. This process acknowledges that requirements are not always well understood at the beginning of a project. The technique gave the team the ability to add features and make changes that could not have been conceived during the initial requirements and design phases. The product constantly evolved through its use on Android emulators and devices. The team made reasonable assumptions about how users would want to use the application, and a plan was enacted to develop the capability that the team members envisioned. Evolutionary prototyping gave the team an advantage over other prototyping processes (e.g., throwaway prototyping) because a functional application existed early in the process. Features were added as the prototype was used, and development continued until the final application was delivered; some team members actually put the prototype to practical use before the final application was delivered. Because the early prototypes were tested through use, new features could be recommended and requested for the next version of the prototype. This also gave the developer the opportunity to take those enhancement requests, along with his own, and update the software requirements specification, revise the design, recode, and retest.

Throughout the project, the team sought to continuously test and provide feedback, and it achieved that goal. The stressors on this methodology and team dynamic were:

  • Lack of familiarity among team members
  • Ramp-up time on some tools
  • Time zone differences of the team members
  • Inclement weather
  • Conflicting employment work schedules, travel and work demands
  • Conflicting demands for deliverables in other GATECH classes
  • Differing opinions about which priority to address next

Document Format

###1 Team members:

  • Jeff Fry
  • Tommy Gaidus
  • Susan Prescott
  • Michael Sterling

###2 Process used

Evolutionary Prototyping

###3 Activity assessment

For each activity the team used, provide:

  • Activity name
  • Quality of the deliverable: (0-100, with rationale for score)

####3.1 Conceptualization / Process Plan

Having a working prototype early in the process made conceptualization a non-issue for Team 3. The opportunity to show and demonstrate a working prototype to the "customer" early in the process provided clear, high-level agreement on the scope of the design. In general, the more time the team had to produce code and show prototypes, the more the deliverable was enhanced; the result not only met the needs of the requirements but also surely delighted the application's users, which supports the high quality score.

Process Plan Score = 98

#####Pros

The process plan clearly defined the required deliverable items, the project owners, and the event milestones needed to complete the project in the time allowed. All team members communicated well once each member's skills and abilities were established, which made the process planning efficient.

#####Challenges

Any team working on a project like this would benefit from an initial face-to-face kick-off meeting to formally clarify team roles, establish an overall purpose, and discuss the key milestone objectives in detail. However, the online kick-off went extremely well: meeting notes were sent and agreed upon, and appropriate roles were assumed. Constraints on team members' schedules were taken into consideration from the start. Despite the production of a phenomenal final product and documentation suite, much of the planning for the project happened over email throughout the prototyping process. The team solved this by establishing periodic open bridge sessions, in which one member would open an audio bridge for several hours and others could join as time permitted within the announced time frame.

####3.3 Design Document

The design document was one that our team focused on extensively during development. It was written to describe what the application we built does, what the possible failure conditions are, how they are handled, and what operations are performed at execution.

Design Document Score = 96

#####Pros

The finished design document was thorough and complete. It focused on the application's objective and functionality, the high-level entities in the design, the usage of the various parts of the system, and the sequencing and structure of the components and their relationships.

#####Challenges

The one challenge with the design document was not having a real customer to interact with, which complicated the writing somewhat. Due to the simplicity of the application's concept, this was easily overcome.

####3.4 Java Source Code

The Java source code delivered for the assignment was well done and satisfied all requirements. Furthermore, the code was built around two critical software engineering tenets: maintainability and performance. Below is the rationale supporting the score of 99 for this area.

Full Java Source Code Score = 99

Maintainability
In general, the code had well-written comments that the average developer could understand. This is key because a developer joining later in the process would otherwise have a hard time understanding the original intent of the code, and that uncertainty could lead to an increase in the number of bugs introduced along with new features.

Performance
In general, the produced code met all the performance requirements. It did not fail in situations where the submitted document or text contained errors that fell outside what was specified in the initial build. From a design perspective, the flow and order of the code were optimal.
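
The delivered TODOER source is not reproduced in this assessment, but a minimal sketch of the commenting style credited above might look like the following. The `Task` class, its validation rule, and the 80-character limit are illustrative assumptions, not excerpts from the team's code.

```java
// Hypothetical example of the commenting style described above; the class,
// helper, and limits are assumptions for illustration, not the team's code.
public final class Task {

    /** Maximum title length; the value here is chosen purely for illustration. */
    private static final int MAX_TITLE_LENGTH = 80;

    private final String title;
    private boolean completed;

    /**
     * Creates a new to-do task.
     *
     * @param title short description of the work to be done; must pass isValidTitle
     * @throws IllegalArgumentException if the title is null, blank, or too long,
     *         so that bad input fails fast instead of surfacing later as a UI bug
     */
    public Task(String title) {
        if (!isValidTitle(title)) {
            throw new IllegalArgumentException("Invalid task title: " + title);
        }
        this.title = title.trim();
        this.completed = false;
    }

    /**
     * Validates a candidate title. Kept as a separate method so the same rule
     * can be reused by any input check added in a later version of the prototype.
     */
    public static boolean isValidTitle(String title) {
        return title != null
                && !title.trim().isEmpty()
                && title.trim().length() <= MAX_TITLE_LENGTH;
    }

    /** Marks the task as done; idempotent, so repeated calls are harmless. */
    public void markCompleted() {
        this.completed = true;
    }

    public String getTitle() {
        return title;
    }

    public boolean isCompleted() {
        return completed;
    }
}
```

The quality the maintainability score rewards is that comments explain the intent behind each decision (why invalid input fails fast, why validation is a separate reusable method) rather than restating what each line does.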

####3.5 Completed System Testing Checklist

The system testing checklist was well thought out and provided varied scenarios to test the accuracy of the application. In particular, the testing document clearly outlined the purpose, steps, and expected results for each test, described the rationale for testing, and included 27 test cases that were run against the application. The test cases were intentionally built to cause problems with the application, but all features ultimately functioned properly. Bugs found in earlier versions of the prototype were documented in the GitHub bug tracker and recorded in the test plan.
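
For reference, a single entry in that checklist might look something like the following. This is a hypothetical illustration of the purpose/steps/expected-result format, not an excerpt from the team's actual test plan.

  • Purpose: Verify that a new to-do item can be added and appears in the task list.
  • Steps: Launch the application on the emulator, enter a task title, and select the add action.
  • Expected result: The new task is displayed in the task list.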

Completed System Testing Checklist Score = 98

#####Pros

The testing document laid out the conditions and approach for testing the product, along with a list of test cases that were actually used to ensure the application worked correctly. An area was provided to record the results.

#####Challenges

The main challenge was creating all of the test scenarios manually. From a testing perspective, 27 cases may seem like a small sample, but the suite was still thorough and covered a great deal.

####3.6 User Instructions

Below is the analysis and feedback relating to the user instructions documentation.

User Instructions Score = TBD

#####Pros

#####Challenges

###4 Productivity Assessment

In general, the team had to be exceptionally attentive to finish this project. The total time spent on the project was just over 150 hours, the total lines of code delivered were over nnn, and the total lines of user documentation were ????. Below is the productivity score given by the team:

Productivity Assessment Score = 99
Collectively, the team felt that this score was well deserved. The team was made up of very driven individuals who all had a tremendous work ethic and good teamwork skills. Success on this project would not have been possible without the work put in by every team member.

  • Total time spent: just over 150 hours
  • Total lines of code in delivered product: nnnn
  • Total lines of user documentation: ????
  • Productivity score: 99 (0-100 scale; rationale above)

###5 Quality Assessment

Below is a summary of the quality assessment for the project.

  • Test scenarios run: 27
  • Total defects found: 6
  • Total number of defects fixed: 6

Quality Assessment Score = 100

###6 Recommendations

During the course of the project, the team agreed on several key recommendations that should ensure an even higher level of efficiency, effectiveness, and overall success on future course projects.

  1. Get initial clarity on the scope of the project in a face-to-face kick-off meeting if geographic proximity permits. This eases the transition into a new team and helps members understand each other's work styles and expectations.
  2. Centralize communication within the team. During the project, the team tried to get together to talk and share documents for review, but because of scheduling constraints everyone was not always able to be present. Google Docs, email, and the GitHub bug tracker were relied on heavily to compensate for the limited ability to meet. Better use of GitHub is recommended for versioning of documentation.
  3. Improve versioning. Versioning is something the team could do even more effectively. This includes more frequent check-ins of work in progress and better documentation of the changes in each version. This type of check-in, along with regular updates, allows the other team members to follow each other's thought processes as they proceed, improves communication, and gives the team more confidence that the work is being addressed.

###7 Summary

Below is a one-paragraph summary of our team's assessment.

Overall, the team worked extremely well together and achieved great success on this project. The difference in time zones was a challenge that was remedied through compromise and the use of tools such as the GitHub bug tracker, email, and Google Docs. The team did a good skills assessment in its initial meeting, which enabled every member to contribute to their peers' development both technically and theoretically. Throughout the entire project, every member contributed as expected by their teammates. There was a drive to create a high-quality final application with extensive documentation, and the result fit the bill and exceeded the stated expectations.

###8 Caveat

You are assessing your own team, so you have a conflict of interest: the more problems you detect with your team's process, the worse your team looks. That is why I have structured this exercise as if you were an independent team from a separate organization. It is your job to be objective. The TA and I will also be evaluating the development team's process. The closer your evaluation score is to ours, the better you will do on this exercise. So be critical but fair.
