<This was posted to the newsgroup, auto copied to alf-dev but not alf-req; I apologize if this is a repeat for some>
Segue has reviewed the POC Use Case docs and proposes the following extensions/additions:
Build Complete Event
====================
Re: “POC Application Requirements Use Blueprint”, p.4, AWS2
Re: “ALF Use Cases: Process & Service Flows”, p.4-5
The ALF common vocabulary should be extended to include product, version, build, and a test type enumeration (e.g., “Full”, “Regression”, “Load”, “Smoke”, NULL). This vocabulary is already shared by several ALM tools. Making its use optional has a significant disadvantage: it shifts the burden onto each tool to do an internal mapping, an onerous point-to-point integration task.
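For illustration only, the proposed test type enumeration might be sketched as follows (the member names are taken from the examples above; NULL is read here as "test type not specified"):

```python
from enum import Enum

class TestType(Enum):
    """Sketch of the proposed ALF test type vocabulary."""
    FULL = "Full"
    REGRESSION = "Regression"
    LOAD = "Load"
    SMOKE = "Smoke"
    NULL = None  # test type not specified

# Tools sharing the vocabulary can map the event string directly:
print(TestType("Smoke"))  # TestType.SMOKE
```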
The composite of Product, Version, Build, and TestType uniquely identifies the tests the Test Management System will execute and their binding to scheduling, locations, etc. Deployment and source locations are not needed. The ExtendedData for the Build Complete event must include, at a minimum, Product, Version, Build, and TestType elements of the form:
<ExtendedData>
  <Data Name="Product">ALF Designer Product</Data>
  <Data Name="Version">1.0</Data>
  <Data Name="Build">1451</Data>
  <Data Name="TestType">Smoke</Data>
  [...]
</ExtendedData>
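As a tool-neutral sketch (not part of the proposal itself), a consumer of the Build Complete event could check that the minimum elements are present before acting on it:

```python
import xml.etree.ElementTree as ET

REQUIRED = {"Product", "Version", "Build", "TestType"}

def read_extended_data(xml_text: str) -> dict:
    """Parse ExtendedData and verify the minimum required Data elements."""
    root = ET.fromstring(xml_text)
    data = {d.get("Name"): d.text for d in root.findall("Data")}
    missing = REQUIRED - data.keys()
    if missing:
        raise ValueError(f"ExtendedData missing: {sorted(missing)}")
    return data

sample = """<ExtendedData>
  <Data Name="Product">ALF Designer Product</Data>
  <Data Name="Version">1.0</Data>
  <Data Name="Build">1451</Data>
  <Data Name="TestType">Smoke</Data>
</ExtendedData>"""

print(read_extended_data(sample)["Build"])  # 1451
```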
Test Build Service Flow
========================
Re: “ALF Use Cases: Process & Service Flows”, p.5 & 9.
There is no need to add test results to the Test Manager tool at the BPEL level after the Test Service executes. This is handled internally by the Test Management tool; the service flow does not need to concern itself with how the results are stored in a common repository.
Scan Code Service Flow
=======================
Re: “ALF Use Cases: Process & Service Flows”, p.8
The flow shows a BPEL operation of “Add Scan Findings to Test Management”. Segue proposes that this be done through an interface to the Test Management service, which Segue will provide. The input arguments would be:
* Product
* Version
* Build
* Test Definition with Success Metrics Array
* Attachments (e.g., links and files)
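The argument list above might map onto a request structure along these lines. This is only a sketch: the actual shape will be defined by Segue's submitted WSDL, and every field and type name here is assumed for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SuccessMetric:
    """One raw success metric from a test definition (field names assumed)."""
    name: str
    value: float

@dataclass
class ScanFindingsInput:
    """Hypothetical inputs to the 'add scan findings' interface."""
    product: str
    version: str
    build: str
    metrics: List[SuccessMetric]  # Test Definition with Success Metrics Array
    attachments: List[str] = field(default_factory=list)  # links and files

call = ScanFindingsInput(
    product="ALF Designer Product",
    version="1.0",
    build="1451",
    metrics=[SuccessMetric("patch_latency_days", 14)],
    attachments=["http://example.invalid/scan-report.html"],
)
```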
The success metrics would be raw data. For a security scan, for example, time values would serve as the metric for patch latency, or integers as the metric for SANS/FBI Top 20 compliance. Success criteria for these raw metrics should be defined within the Test Management System. Furthermore, the Test Manager should expose these metrics categorized as passed/failed according to the criteria, as well as their statistical evaluations based upon the test definitions.
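A minimal sketch of the passed/failed categorization described above, where the metric names and thresholds are invented for illustration and the real criteria would live in the Test Management System:

```python
# Raw metric values reported by a security scan (names invented for illustration).
raw = {"patch_latency_days": 45, "sans_fbi_top20_violations": 0}

# Success criteria, as they might be defined in the Test Management System
# (thresholds here are assumed, not part of the proposal).
criteria = {
    "patch_latency_days": lambda v: v <= 30,        # pass if patched within 30 days
    "sans_fbi_top20_violations": lambda v: v == 0,  # pass only with zero violations
}

# Categorize each raw metric as passed/failed against its criterion.
results = {name: ("passed" if criteria[name](value) else "failed")
           for name, value in raw.items()}
print(results)  # {'patch_latency_days': 'failed', 'sans_fbi_top20_violations': 'passed'}
```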
Segue will provide this interface in its submitted WSDL.
Test Build Complete Event
=========================
Re: “POC Application Requirements Use Blueprint”, p.2, AEB13
Segue proposes that the Event Manager handle Test Build Complete events. In addition to the standard header, the event would contain ExtendedData describing the number of tests passed, failed, and unable to be executed. The ExtendedData would be of the form:
<ExtendedData>
  <Data Name="Product">ALF Designer Product</Data>
  <Data Name="Version">1.0</Data>
  <Data Name="Build">1451</Data>
  <Data Name="TestType">Smoke</Data>
  <Data Name="NumTestsPassed">12</Data>
  <Data Name="NumTestsFailed">3</Data>
  <Data Name="NumTestsUnableExe">1</Data>
  [...]
</ExtendedData>
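A consumer of this event could summarize the counts like so; this is a sketch against the ExtendedData shape shown above, not a prescribed implementation:

```python
import xml.etree.ElementTree as ET

event = """<ExtendedData>
  <Data Name="Product">ALF Designer Product</Data>
  <Data Name="NumTestsPassed">12</Data>
  <Data Name="NumTestsFailed">3</Data>
  <Data Name="NumTestsUnableExe">1</Data>
</ExtendedData>"""

# Index the Data elements by their Name attribute.
data = {d.get("Name"): d.text for d in ET.fromstring(event).findall("Data")}

passed = int(data["NumTestsPassed"])
failed = int(data["NumTestsFailed"])
not_run = int(data["NumTestsUnableExe"])
total = passed + failed + not_run

print(f"{passed}/{total} passed")  # 12/16 passed
```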
Thanks,
Bob
Bob Brady, Ph.D.
Senior Software Engineer
Segue Software
201 Spring Street
Lexington, MA 02421
t: 781-402-5975
f: 781-402-1097
w: www.segue.com