US Department of Agriculture (USDA), Animal and Plant Health Inspection Service (APHIS) – NAIS Performance and Scalability Analysis

The Animal and Plant Health Inspection Service (APHIS) was created in 1972 and is a science-based organization whose primary goal is protecting American agriculture. APHIS’ mission is to provide leadership in ensuring the health and care of animals and plants, to improve agricultural productivity and competitiveness, and to contribute to the national economy and the public health.

APHIS contracted with Technik to test and analyze the performance and scalability of the NAIS to prepare for the program’s anticipated future performance needs.

Project Accomplishments

Project accomplishments and associated activities are described below.

Task 1: Project Management and Reporting

Technik met with the COTR to discuss timing, contacts, documentation and other pertinent information. We performed an analysis of the work and developed a project schedule and a project management plan (PMP). We developed and presented a project schedule with milestones that corresponded to the CLINs. Work was re-evaluated and re-prioritized at the end of each CLIN. Functionality reports, acceptance testing, code check-in and documentation were completed to coincide with the sprint cycles. As required by VS, Technik utilized Earned Value Reports to measure our productivity. All work was performed in compliance with the development environment methods for source control (SVN), unit testing (nUnit), and documentation.

Deliverables included: project management plan and project master schedule, weekly status meeting minutes and monthly project status reports.

Task 2: Software Development, Requirement Analysis and Operations/Maintenance

The NAIS IT team developed a suite of applications over the past five years to support the business functions of the NAIS program. Technik tested the capacity of the system, the hardware and software configuration, the architectural “best practices”, and adherence to Section 508 standards. Specifically, we completed the following tasks:

    1. Created a base data set for each of the NAIS applications: SPRS, NPIR, AINM, and ATPS
    2. Created a tool for extracting a user-defined percentage of each of the data sets above (a minimal sketch follows this list)
    3. Benchmarked all current applications individually and the system as a whole in production or a production-like environment, using NAIS usage trends, for the following conditions (a simplified load-driver sketch follows this list):
      1. Current user/record load (as in the current production environment)
      2. Possible increased future load
      3. Worst-case scenario load
    4. Reviewed and documented the hardware/software/application/architecture/configuration and recommended changes to optimize the system for each scenario.
    5. Reviewed the current application architecture for possible performance-related issues and documented recommendations.
    6. Reviewed and documented the NAIS databases (SPRS and NPIR) for inconsistent structure – identified and reported on all data elements in SPRS that differ in definition, usage, or meaning from the corresponding data elements in NPIR.
    7. Reviewed and documented the current NAIS data sets (SPRS and NPIR) for inconsistent data – identified and reported on all records in SPRS that differ in content from the corresponding records in NPIR.
    8. Reviewed and documented the current NAIS data sets (SPRS and NPIR) for inappropriate data – identified and reported on all records in NPIR/SPRS that do not have an address qualifying as a Wireline Enhanced 911 address (see the address-screening sketch following this list). These addresses are used by the emergency E911 offices to determine real physical locations, and do not include non-location-specific post office addresses such as PO Box or Rural Route addresses.
    9. Reviewed and documented the current NAIS data sets (SPRS and NPIR) for inaccurate data – identified and reported on all records in NPIR/SPRS premises address lists whose stored latitude/longitude is based on a zip code centroid point (a comparison sketch follows this list). Coordinates must agree within 3 decimal digits to be counted as a match, and comparisons must be based on the most recent national zip code centroid definitions.
    10. Reviewed and documented the current NAIS data sets (SPRS and NPIR) for inaccurate data – identified and reported on all records in NPIR/SPRS premises address lists that have an address location in the wrong county, and what the correct county should be, based on the most recent national county definitions.
    11. Reported recommendations – analyzed, defined, and reported on recommendations to resolve the structural differences and data deficiencies identified in the previous tasks.
    12. Executed tools to measure automated test code coverage for all NAIS applications (SPRS, AINM, and ATPS) and documented findings.
    13. Analyzed all NAIS applications (SPRS, DMC, AINM, ATPS) for compliance with Section 508 standards.
    14. Managed ATPS Suite 3 testing by completing the following tasks:
      1. Developed and documented the Suite 3 test design (group/lot ID was not considered for this round of testing)
      2. Built a data set adequate for testing – this data set contained tens of millions of records distributed appropriately over millions of premises. VS provided guidance on creating the data set to ensure the data was distributed appropriately and that there was a realistic model for simulating animal movement.
      3. Ran an initial Suite 3 test with simulated ATDs (the code to simulate ATDs was provided)
    15. Organized Suite 3 testing for ATPS by completing the following tasks:
      1. Distributed the data set
      2. Coordinated ATDs
      3. Managed testing
      4. Analyzed and reported results from the tests
      5. Made recommendations to improve future testing
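
The data-set subsetting tool in item 2 reduced each base data set to a user-defined percentage for scaled-down test runs. The following is a minimal sketch of that idea, assuming a simple one-record-per-line text export; the class name, file names, and seed are illustrative, not the actual NAIS tool.

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Random;
import java.util.stream.Stream;

/** Writes a user-defined percentage of the input records to the output file. */
public class DataSetSampler {

    public static void sample(Path input, Path output, double percent, long seed) throws IOException {
        Random random = new Random(seed); // fixed seed so a given test data set is reproducible
        try (Stream<String> records = Files.lines(input);
             PrintWriter out = new PrintWriter(Files.newBufferedWriter(output))) {
            records.filter(r -> random.nextDouble() * 100.0 < percent)
                   .forEach(out::println);
        }
    }

    public static void main(String[] args) throws IOException {
        // e.g. keep roughly 25% of a base data set (paths are placeholders)
        sample(Paths.get("sprs_base.txt"), Paths.get("sprs_25pct.txt"), 25.0, 42L);
    }
}
```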
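
Item 3 benchmarked the applications under current, increased, and worst-case loads. The sketch below shows only the general shape of such a test driver: a fixed pool of simulated users repeatedly hitting an endpoint while latencies are collected and summarized. The URL, user count, and request count are placeholders; the actual benchmarks were driven by NAIS usage trends rather than these illustrative numbers.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/** Simplified load driver: N concurrent "users" each issue M requests; latencies are summarized. */
public class LoadDriver {

    static long timedRequestMillis(String url) throws Exception {
        long start = System.nanoTime();
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) { /* drain the response body */ }
        }
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        String url = "http://nais.example.test/premises/search"; // placeholder endpoint
        int users = 50;              // scale up to model increased and worst-case loads
        int requestsPerUser = 20;

        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<List<Long>>> futures = new ArrayList<>();
        for (int u = 0; u < users; u++) {
            futures.add(pool.submit((Callable<List<Long>>) () -> {
                List<Long> latencies = new ArrayList<>();
                for (int r = 0; r < requestsPerUser; r++) {
                    latencies.add(timedRequestMillis(url));
                }
                return latencies;
            }));
        }

        List<Long> all = new ArrayList<>();
        for (Future<List<Long>> f : futures) {
            all.addAll(f.get());
        }
        pool.shutdown();

        Collections.sort(all);
        System.out.printf("requests=%d median=%dms p95=%dms max=%dms%n",
                all.size(),
                all.get(all.size() / 2),
                all.get((int) (all.size() * 0.95)),
                all.get(all.size() - 1));
    }
}
```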
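
Item 8 required flagging premises addresses that cannot serve as Wireline Enhanced 911 locations, such as PO Box or Rural Route addresses. A minimal screening check might look like the following; the patterns shown are illustrative examples only, not the full rule set used in the actual analysis.

```java
import java.util.regex.Pattern;

/** Flags address lines that are not location-specific and therefore not usable as E911 addresses. */
public class E911AddressCheck {

    // Illustrative patterns only: PO Box, Post Office Box, Rural Route / RR n, Highway Contract box.
    private static final Pattern NON_LOCATION = Pattern.compile(
            "\\b(P\\.?\\s*O\\.?\\s*BOX|POST\\s+OFFICE\\s+BOX|RURAL\\s+ROUTE|R\\.?R\\.?\\s*\\d+|HC\\s*\\d+\\s*BOX)\\b",
            Pattern.CASE_INSENSITIVE);

    public static boolean isNonLocationAddress(String addressLine) {
        return addressLine != null && NON_LOCATION.matcher(addressLine).find();
    }

    public static void main(String[] args) {
        System.out.println(isNonLocationAddress("PO Box 142, Ames, IA"));      // true  -> flag for reporting
        System.out.println(isNonLocationAddress("1400 Independence Ave SW"));  // false -> usable street address
    }
}
```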
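
Item 9 flagged premises whose stored coordinates appear to be a zip code centroid rather than a geocoded location, using agreement to 3 decimal digits as the match rule. Below is a sketch of that comparison, assuming the centroid coordinates have already been looked up from the current national zip centroid definitions; the sample values are made up.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

/** Detects coordinates that match a zip code centroid when both are rounded to 3 decimal places. */
public class CentroidMatchCheck {

    private static BigDecimal round3(double value) {
        return BigDecimal.valueOf(value).setScale(3, RoundingMode.HALF_UP);
    }

    /** True when the premises lat/long equals the zip centroid lat/long to 3 decimal digits. */
    public static boolean matchesCentroid(double premLat, double premLon,
                                          double centroidLat, double centroidLon) {
        return round3(premLat).compareTo(round3(centroidLat)) == 0
            && round3(premLon).compareTo(round3(centroidLon)) == 0;
    }

    public static void main(String[] args) {
        // Illustrative values: the first pair agrees with the centroid to 3 decimals, the second does not.
        System.out.println(matchesCentroid(38.9041, -77.0170, 38.9043, -77.0168)); // true  -> likely centroid default
        System.out.println(matchesCentroid(38.9337, -77.1772, 38.9043, -77.0168)); // false -> geocoded to a distinct point
    }
}
```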

Deliverables included: updated source code in the version control system, completion and documentation of ClearQuest tickets, and other applicable documentation. Specific CLIN deliverables included the Application and System Architecture Report, Data Set and Methodology Report for Performance and Scalability Testing, Performance and Scalability Analysis Report, Automated Test Code Coverage Report, 508 Compliance Report, ATPS Suite 3 Testing Plan Report, ATPS Suite 3 Dataset for Testing, and ATPS Suite 3 Testing Report.

Task 3: Performance Metrics / Criteria for Acceptance

Performance Metrics Reports covered quality, responsiveness, timeliness, narratives, staffing/vacancy reports, invoices, and accuracy of data.

Overall Customer Satisfaction: Excellent
Delivery: Technik adhered to all deadlines and produced deliverables in a timely manner.
Cost Control: Technik performed within cost and budget.
Corrective Actions Taken: None

Technologies Used

Technologies used include: Java Platform, Enterprise Edition (J2EE); Microsoft (MS) SQL Server; MS Access; Microsoft SharePoint; Struts; Velocity; Spring; jQuery; iBATIS; Struts-menu; Spring Security; Spring AOP; Web Services (SOAP via HTTPS and FTPS); XML; BEA WebLogic Server; custom EJBs; BPEL; and BEA WebLogic Portal.