Testing ILL Protocol Implementations

In this Document:

  • Introduction
  • ILL Interoperability Test Steps
  • Composition of Test Parties
  • Test Case Selection
  • Interoperability Test Campaign
  • Reporting Interoperability Test Results
  • Interoperability Test Results Analysis
  • Role of the Testing Coordinator

Other Testing Documents

Background Reading:
  • ILL Protocol Interoperability Testing
  • Test Beds

For more information, contact the ILL Application Standards Maintenance Agency, Library and Archives Canada: ill_asma@lac-bac.gc.ca

Last Update: 2002/01/15



Testing ILL Protocol Implementations
Interlibrary Loan Application Standards Maintenance Agency


ILL PROTOCOL IMPLEMENTATION INTEROPERABILITY TESTING

Prepared by the National Library of Canada for the ILL Protocol Implementation Programme Interoperability Testing, July 1992

Revised by Barbara Shuh, Interlibrary Loan Application Standards Maintenance Agency, National Library of Canada July 1997 as Background Information for ILL Protocol Implementors Group (IPIG)

INTRODUCTION

The objective of Interoperability Testing is to verify that implementations of the ISO Interlibrary Loan (ILL) Protocol Standard (ISO 10160/10161) can interoperate, that is, communicate and provide the services stipulated by the Standard.

This document describes a methodology for ILL Interoperability Testing and the composition of the participating test parties. Interoperability testing is limited to normal operation and verifies that each implementation can invoke the services it supports. Test cases focusing on the use of optional APDUs have also been defined.

The procedure described here was developed by the National Library of Canada (NLC) for the testing of Canadian ILL protocol implementations in 1992.

During this testing, NLC played a coordinating and technical supporting role.

The Canadian interoperability test suite was designed with the assumption that each implementation had undergone conformance testing and that the implementation conformed to the ISO ILL Standard and the ILL Interim Canadian Standardized Profile. It could be adapted for use by IPIG implementors.

ILL INTEROPERABILITY TEST STEPS

The process for ILL interoperability testing consists of the following steps:

Test Setup

  • Select the test parties.
  • Establish a test plan for each member of the test party.

Test Campaign

Each member of the test party:

  • starts testing when all partners within the test party have received their test plans;
  • executes the cases listed in the test plan;
  • responds when a transaction is initiated by a partner;
  • uses the Test Results form to record the events of all transactions, including any problems which occur during the test campaign;
  • returns the test results to the testing coordinator at the completion of the test campaign.

To minimize the need for coordination among the partners during the test campaign, the details of test case execution should be provided within the test plan.

Results Analysis

The Testing Coordinator:

  • analyzes the test results;
  • generates a report for each implementation.

The objective of the report is to provide statistics on the interoperability testing and to list the problems which occurred.
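The campaign steps above amount to a simple driver loop over the test plan. A minimal Python sketch (the `execute_case` callback and the result-record shape are hypothetical stand-ins for what is, in practice, a manual procedure using the paper Test Results form):

```python
def run_campaign(test_plan, execute_case):
    """Work through a test plan in order, recording the events and any
    problems of every transaction for return to the testing coordinator."""
    results = []
    for case in test_plan:
        events, problems = execute_case(case)  # run (or respond to) one case
        results.append({"case": case, "events": events, "problems": problems})
    return results  # handed to the coordinator at the end of the campaign
```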

    COMPOSITION OF TEST PARTIES

    A test party consists of a number of implementors that have successfully completed conformance testing. The composition of test parties is based on two objectives:

  • Providing a sufficient number of participants to play all required roles for testing ILL services,
  • Allowing each implementation to test against at least two other implementations.

    To meet the above objectives, a test party should be composed of at least three participants. Thus each implementation will have a sufficient number of partners for testing the FORWARD service. For testing other services, each implementation plays the role of the requester for one partner, and the role of the responder for the other. This ensures that the implementations are tested with at least two others.
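The rotation used in the test party tables follows a simple pattern: participant i plays requester toward participant i+1, with participant i+2 acting as final responder for FORWARD, wrapping around the party. A minimal Python sketch of that assignment (the function name is ours, not from the test suite):

```python
def role_rotation(participants):
    """Assign (requester, responder, final responder) triples so that every
    implementation tests against at least two partners: participant i
    requests from i+1, and i+2 acts as final responder for FORWARD."""
    n = len(participants)
    if n < 3:
        raise ValueError("a test party needs at least three participants")
    return [(participants[i],
             participants[(i + 1) % n],
             participants[(i + 2) % n]) for i in range(n)]
```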

TEST PARTY (Four Participants)

Requester Role   Responder Role   Final Responder (for Forwarding)
PARTICIPANT 1    PARTICIPANT 2    PARTICIPANT 3
PARTICIPANT 2    PARTICIPANT 3    PARTICIPANT 4
PARTICIPANT 3    PARTICIPANT 4    PARTICIPANT 1
PARTICIPANT 4    PARTICIPANT 1    PARTICIPANT 2

TEST PARTY (Three Participants)

Requester Role   Responder Role   Final Responder (for Forwarding)
PARTICIPANT 5    PARTICIPANT 6    PARTICIPANT 7
PARTICIPANT 6    PARTICIPANT 7    PARTICIPANT 5
PARTICIPANT 7    PARTICIPANT 5    PARTICIPANT 6

The tables above illustrate the roles the participants will play with each other. A test plan will be defined for each participant, and the tables can be used to determine the partners during test case selection. It may be necessary to substitute a different partner if the assigned partner does not support a given service.

    TEST CASE SELECTION

    The Implementation Under Test (IUT) and its testing partners should select test cases based on the ILL services that their ILL protocol implementations support. They can use the test plan proforma, ILL INTEROPERABILITY TEST PLAN, which provides the required tables for making this selection.

A test plan is completed for each participant. It should:

  • list the test cases to be executed by the IUT;
  • identify the testing partners;
  • provide any information required for establishing communications between partners, such as e-mail addresses within the test party and library symbols.

    The Test Plan, when completed, should identify all test cases to be run by the implementation and the partner associated with each test case. The content of each test is specified in the Interlibrary Loan (ILL) Interoperability Test Suite.

    Each table in the test plan contains the following information:

Common Services:
identifies the ILL services required by all test cases in a specific test group.
SERVICES SUPPORTED:
identifies the services required by the test cases listed in the IUT CASE and PARTNER CASE columns.
IUT CASE:
indicates the test case which the implementation operator executes.
PARTNER CASE:
indicates the test case which the corresponding partner executes.
PARTNER SYMBOL:
the institution symbol used by the testing partner during the exchange of test cases; identifies the testing partner.
DATE COMPLETED:
supplied by the IUT operator to indicate the date on which the case was completed. Use the annotation "NOT RUN" if either the IUT or its partner does not support a service required by the test case.

    See the sample ILL INTEROPERABILITY TEST PLAN TABLE 4. This table lists the requester test cases which test the ILL services required in transactions of returnable items.

    In this scenario, the annotation "NOT RUN" is entered in "DATE COMPLETED" column for CASE153 and CASE154, since the implementation does not support the RENEW service and its partner does not support the OVERDUE service. The institution-symbol SYMB2 found in the PARTNER SYMBOL column identifies the partner that plays the responder role and corresponds to the partner identified in the Test Party.
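The "NOT RUN" rule reduces to a service-support check on both sides of the transaction. A hedged Python sketch (the helper name and the set representation of supported services are assumptions, not part of the test plan proforma):

```python
def date_completed_annotation(required, iut_supports, partner_supports):
    """Return "NOT RUN" when either the IUT or its partner lacks a service
    required by the test case; otherwise None, leaving the DATE COMPLETED
    cell for the operator to fill in.  All arguments are collections of
    service names, e.g. {"RENreq", "REAind"}."""
    required = set(required)
    if not required <= set(iut_supports) or not required <= set(partner_supports):
        return "NOT RUN"
    return None
```

CASE154, for instance, requires the RENEW services, so an IUT that does not support them would be annotated NOT RUN.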

ILL INTEROPERABILITY TEST PLAN

Table 4 - GROUP: INTEROP/REQUEST/RETURN
Common Services: ILLreq, SHIind, RCVreq, RETreq, CHKind

SERVICES SUPPORTED   IUT CASE   PARTNER CASE   PARTNER SYMBOL   DATE COMPLETED
RCLind               CASE150    CASE250        SYMB2
RETreq               CASE151    CASE251        SYMB2
CHKind               CASE152    CASE252        SYMB2
DUEind               CASE153    CASE253                         NOT RUN
RENreq, REAind       CASE154    CASE254                         NOT RUN
LSTreq, LSTind       CASE155    CASE255        SYMB2
DAMreq, DAMind       CASE156    CASE256        SYMB2

    INTEROPERABILITY TEST CAMPAIGN

    Before testing begins, participants must contact their partners. The contact person, telephone number, library symbol, etc. can be found in the test plan.

    During the test campaign, an IUT operator will initiate the transactions for the cases where it plays the role of the requester. The cases to be executed are identified in the test plan and specified in the ILL Interoperability Test Suite. A case specification identifies all events which occur in a test case (see the example below). Consult the document "ILL Interoperability Test Suite" for a description of the notation used in specifying test cases.

    Test Case Dynamic Behaviour

    Reference: INTEROP/REQUEST/RETURN/CASE154
    Identifier: T30
    Purpose: Testing additional services required for
    returnable items

    Renew: The REN APDU generated by the requester should
    have all parameters filled with as much data as possible.
    Renew-Answer: The REA APDU generated by the responder
    should have all parameters filled with as much data as
    possible.

    Defaults Reference:

     

Behaviour Description                  Constraints Reference   Verdict

<IUT!ILLreq>                           BSILLreq001
 L!ILL
  L?SHI
   <IUT?SHIind>                        BSSHIind001
    <IUT!RCVreq>                       BSRCVreq001
     L!RCV
      <IUT!RENreq>                     BSRENreq000
       L!REN
        L?REA
         <IUT?REAind>                  BSREAind000(?)
          <IUT!RETreq>                 BSRETreq000
           L!RET
            L?CHK
             <IUT?CHKind>              BSCHKind000

Thirty-four test cases have been defined for each role. Thus an implementation will run a maximum of 68 test cases during a test campaign. Before executing a test case, the operator should consult the Purpose section of the Dynamic Behaviour Table, which provides specific instructions for executing the case.

    In the example of Test Case Dynamic Behaviour, CASE154, the requester is encouraged to provide data for all parameters in the RENreq service and the responder, data for all parameters in the REAreq service.
    In addition to these specific instructions, the IUT operator should perform the following:

  • When submitting a request, add the IUT case name to the requester-note parameter to allow the partner to identify the case to which the transaction belongs. The partner will then be able to associate the case name to the partner case in his/her test plan.

  • If your implementation does not supply an optional APDU when a related service is invoked (SHIreq, RCVreq, RETreq, or CHKreq), invoke a STRreq to let the partner know that the service has been invoked and the optional APDU not sent. The IUT invokes two request services while the partner would get one indication service.
    For example, the responder invokes the SHIreq (no SHI APDU is sent) followed by the STRreq. The requester would only receive the STR APDU and get an STRind.

  • When executing the tests, record the events which occur during each test case and preserve all logs relating to the test case.
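The STRreq convention in the second bullet determines which events the invoking side records. A minimal Python illustration (the function and argument names are ours, not from the test suite):

```python
def sender_events(service, apdu, sends_optional_apdu):
    """Events recorded by the side invoking a service whose APDU is
    optional: if the optional APDU is not transmitted, a STRreq follows
    the service so the partner still learns the service was invoked."""
    if sends_optional_apdu:
        return [service, "sent " + apdu]
    return [service, "STRreq", "sent STR"]
```

For example, a responder that does not transmit the SHI APDU records "SHIreq", "STRreq", "sent STR", and its requester partner receives only the STR APDU.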

    Testing partners should discuss how test cases are to be executed.

For better efficiency, execute the test cases belonging to the same test group concurrently. If store-and-forward communications are used, executing the cases one at a time takes an impractical length of time. The groups should be processed in the following order:

    • BASIC
    • NONRET
    • OPTAPDU
    • RETURN
    • FORWARD

    This order reflects the increasing complexity of the ILL transactions defined in the test cases. It is also possible to execute all cases at once.

In addition, a group of APDUs can be concatenated into one ILL message. For example, when executing the BASIC group, the requester initiates three transactions, one for each of the cases CASE100, CASE101, and CASE102, and combines them into one message.

    Each testing partner should start initiating the transactions for the requester group as indicated above. This means that implementations will be processing transactions as a responder and a requester simultaneously, reflecting the normal operations of a library.

    REPORTING INTEROPERABILITY TEST RESULTS

Use the Interoperability Test Results form to report test results. Complete a form for each transaction of the test campaign.

    Attach system logs to the test result form. The logs can consist of printouts of received and transmitted APDUs, screen dumps illustrating the data from a request or indication service, and/or communication logs listing the events and actions occurring within the system during transaction processing.

    An example of a completed form for test case CASE154 is provided here.

    Elements of the Test Results form are:

    CASE NAME:
    Case name associated with the transaction.
    In the example, CASE154 was executed.
    PARTNER SYMBOL:
    Partner's institution-symbol, to provide a means of identifying the partner in the transaction.
    In the example, the partner's institution symbol is "SYMB2".
    TRANSACTION ID:
    Transaction's identifier, to provide a means of associating the Test Result form to the ILL transaction in the system under test.
    DATE STARTED:
    Date the transaction started.
    In our example, the transaction started on November 5, 1991.
    DATE COMPLETED:
    Date the transaction was completed.
    In our example the transaction was completed on November 6, 1991.
    EVENTS:
    Test events which occurred for the transaction. For each event specified in the TTCN test case, there will be a corresponding recorded test event.
    The recorded event "ILLreq" corresponds to the event "<IUT!ILLreq>" of CASE154, the recorded event "sent ILL" corresponds to the event "L!ILL" of CASE154, etc. Note that when the event involves APDUs, the action taken "sent" or "received" is recorded. For service primitives, the abbreviation implicitly defines the action taken.

The recorded events will differ from the test case specification when an optional APDU is not transmitted. In this situation, the initiator of the request service invokes the STRreq service, and the partner receives only the STR APDU (an STRind) rather than the specified indication service.

The sample Test Results form illustrates two such situations. The RET APDU is not sent when the RETreq is invoked; note that no event "sent RET" has been recorded. Instead, the IUT operator has invoked the STRreq service, which is followed by the "sent STR" event. The partner will only see the STR APDU and get an STRind. In this situation, the test case specification events "<IUT!RETreq>" and "L!RET" are replaced by the three recorded events "RETreq", "STRreq", "sent STR".

    The second situation is illustrated by the recorded events "received STR", "STRind (cHECKED-IN)". Here, the IUT received an STR APDU indicating that the partner invoked the CHKreq service without sending the optional APDU CHK. The IUT operator in this case records that the STR was received and got an STRind. The (cHECKED-IN) is the most-recent-service parameter value from Status-or-Error-Report indication. In this example, the test case specification events "L?CHK" and "<IUT?CHKind>" are replaced by the recorded events "received STR" and "STRind (cHECKED-IN)".
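The substitution on the receiving side can be sketched the same way; a hypothetical Python helper (names are illustrative) showing how "received STR" and an STRind carrying the most-recent-service parameter stand in for the missing APDU and indication:

```python
def recorded_receive_events(apdu, indication, via_str, most_recent_service=None):
    """Events the IUT operator records on receipt: the optional APDU and
    its indication when it was sent, or "received STR" plus an STRind
    carrying the partner's service in the most-recent-service parameter."""
    if via_str:
        return ["received STR", "STRind (%s)" % most_recent_service]
    return ["received " + apdu, indication]
```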

    PROBLEMS:
    Description of problems encountered during the course of transaction.
    In the example, no problems were encountered.
    RESOLUTION:
    Actions taken to resolve the problems recorded in the "PROBLEMS" section.

    INTEROPERABILITY TEST RESULTS

    CASE: __CASE154_______ PARTNER SYMBOL: __SYMB2______

    TRANSACTION ID: __1234567890_____________________

    Date Started: Nov. 5, 1991 Date Completed: Nov. 6, 1991

    EVENTS:

    1) _ILLreq__________________ 9) _received REA___________

    2) _sent ILL________________ 10) _REAind_________________

    3) _received SHI____________ 11) _RETreq_________________

    4) _SHIind__________________ 12) _STRreq_________________

    5) _RCVreq__________________ 13) _sent STR_______________

    6) _sent RCV________________ 14) _received STR___________

    7) _RENreq__________________ 15) _STRind (cHECKED-IN)____

    8) _sent REN________________ 16) _______________________

    PROBLEMS: __No problems encountered._____________________

    _________________________________________________________

    _________________________________________________________

    _________________________________________________________

    RESOLUTION: _____________________________________________

    _________________________________________________________

    _________________________________________________________

    _________________________________________________________

    INTEROPERABILITY TEST RESULTS ANALYSIS

    The test results provided by the implementations will be analyzed for completion of test cases and reviewed for any problems and resolutions. A summary test report, "System Interoperability Test Report", will be prepared for each participant. This report will provide statistics on tests run during interoperability testing and comments on problems that occurred during testing.

    ROLE OF THE TESTING COORDINATOR

Tasks of the ILL Implementation Interoperability Testing Coordinator include:

Test Setup:
  • Selection of the test parties and creation of the test plans for each implementation.
  • Gathering from the participants the information required for completing the test plans.

Test Campaign:
  • Provision of technical support in resolving problems which may occur during the test campaign.

Results Analysis:
  • Analysis of the test results.
  • Generation of the Interoperability Test Reports for each implementation.


    copyright © 1997
    Interlibrary Loan Application Standards Maintenance Agency