# Compliance Assets Plan: NZ Public Service API Development
**Date**: 2025-12-10
**Status**: Draft
**Purpose**: Define compliance assets that help public service API developers work efficiently while automatically producing compliance evidence.
---
## Overview of Compliance Landscape
The source materials comprise **7,106 structured document elements** across:
| Document | Elements | Share of Parts A-C | Focus |
|----------|----------|------------|-------|
| Part A | 692 | 12.5% | API concepts and management |
| Part B | 1,699 | 30.5% | Technical specifications and security |
| Part C | 3,187 | 57% | Implementation guidance |
| Draft API Standard | 1,528 | - | Consolidated requirements |
### Normative Language (RFC 2119)
- **MUST/MUST NOT**: Absolute requirements (mandatory compliance)
- **SHOULD/SHOULD NOT**: Strong recommendations (deviation requires justification)
- **MAY**: Truly optional guidance
All requirements include **DocRef citations** for traceability to source documents.
---
## Proposed Compliance Assets
### 1. API Requirements Registry (Foundational Dataset)
**Purpose**: A structured, queryable dataset extracting all normative requirements from source documents.
**Format**: JSON/CSV dataset with fields:
| Field | Description |
|-------|-------------|
| `requirement_id` | Unique identifier |
| `requirement_text` | The actual requirement statement |
| `normative_level` | MUST \| SHOULD \| MAY |
| `lifecycle_phase` | Design \| Development \| Security \| Deployment \| Operations |
| `api_category` | System \| Process \| Experience (or "all") |
| `api_exposure` | Internal \| Partner \| External \| Public (or "all") |
| `docref_url` | Traceable citation to source |
| `docref_id` | GHI for the source element |
| `tags` | Classification tags (security, documentation, versioning, etc.) |
**Compliance Evidence Produced**:
- Requirement selection logs (which requirements apply to this API)
- Coverage reports (percentage of applicable requirements addressed)
- Gap analysis datasets
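The registry's value comes from being queryable. A minimal sketch of requirement selection and coverage reporting over the CSV form, assuming illustrative requirement IDs and values (the field names follow the table above):

```python
import csv
import io

# Hypothetical registry rows; values are illustrative only.
REGISTRY_CSV = """requirement_id,normative_level,lifecycle_phase,api_exposure
SEC-001,MUST,Security,all
DOC-001,MUST,Deployment,Public
REST-012,SHOULD,Design,Internal
"""

def applicable(rows, exposure):
    """Requirement selection: keep rows matching the API's exposure level."""
    return [r for r in rows if r["api_exposure"] in ("all", exposure)]

def coverage(selected, satisfied_ids):
    """Coverage report: percentage of applicable requirements addressed."""
    hits = sum(1 for r in selected if r["requirement_id"] in satisfied_ids)
    return round(100 * hits / len(selected))

rows = list(csv.DictReader(io.StringIO(REGISTRY_CSV)))
selected = applicable(rows, "Public")   # SEC-001 (all) and DOC-001 (Public)
print(coverage(selected, {"SEC-001"}))  # 50
```

Persisting the selection and the coverage figure per run yields the selection logs and coverage reports listed above as a side effect of normal use.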
---
### 2. Compliance Checklist Generator
**Purpose**: Generate context-specific checklists based on API characteristics.
**Inputs**:
- API category (System/Process/Experience)
- Exposure level (Internal/Partner/External/Public)
- Lifecycle phase (Design/Development/etc.)
- Technology choices (REST/GraphQL/AsyncAPI/gRPC)
**Output**: Filtered checklist of applicable requirements with:
- Checkbox completion status
- Evidence attachment capability
- DocRef citations for each item
- Timestamp logging
**Compliance Evidence Produced**:
- Completed checklists with timestamps
- Evidence attachments per requirement
- Sign-off records
- Audit trail of checklist generation parameters
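Checklist generation is essentially filtering plus rendering. A sketch under the assumption that requirements come from the registry (IDs and texts here are illustrative), with the generation parameters embedded for the audit trail:

```python
from datetime import datetime, timezone

# Illustrative requirement records; a real run would pull these from the
# API Requirements Registry.
REQUIREMENTS = [
    {"id": "SEC-201", "text": "APIs MUST use TLS 1.2 or higher",
     "exposure": "all"},
    {"id": "DOC-001", "text": "API documentation MUST be published",
     "exposure": "Public"},
]

def generate_checklist(requirements, exposure):
    """Render a markdown checklist; the header comment records the
    generation parameters and timestamp for the audit trail."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    lines = [f"<!-- generated: {stamp} | exposure: {exposure} -->"]
    for req in requirements:
        if req["exposure"] in ("all", exposure):
            lines.append(f"- [ ] **{req['id']}**: {req['text']}")
    return "\n".join(lines)

print(generate_checklist(REQUIREMENTS, "Internal"))
```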
---
### 3. OpenAPI Specification Validator
**Purpose**: Automated validation of OpenAPI specs against NZ API Standard requirements.
**Validation Rules** (mapped from source documents):
- HTTP method usage patterns
- Response code requirements
- Security scheme definitions
- Pagination implementation
- Error response structures
- Versioning patterns in URLs/headers
- Header requirements (Content-Type, Accept, etc.)
- Path naming conventions
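Each rule in the list above can be expressed as a predicate over the parsed specification. A minimal sketch for the security-scheme rule (the requirement ID is illustrative; the field path `components.securitySchemes` is standard OpenAPI 3.x):

```python
def check_security_schemes(spec):
    """Rule: the spec should declare at least one security scheme."""
    schemes = spec.get("components", {}).get("securitySchemes", {})
    if schemes:
        return {"requirement_id": "SEC-001", "status": "PASS",
                "evidence": "Security scheme(s) defined: " + ", ".join(schemes)}
    return {"requirement_id": "SEC-001", "status": "FAIL",
            "message": "No securitySchemes defined under components"}

spec = {
    "openapi": "3.0.3",
    "components": {"securitySchemes": {"oauth2": {"type": "oauth2"}}},
}
print(check_security_schemes(spec)["status"])  # PASS
```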
**Output Format**:
```json
{
  "api_name": "example-api",
  "version": "1.0.0",
  "validation_date": "2025-12-10T...",
  "compliance_score": 94,
  "results": [
    {
      "requirement_id": "SEC-001",
      "status": "PASS",
      "docref": "https://docref.digital.govt.nz/...",
      "evidence": "Security scheme 'oauth2' defined"
    },
    {
      "requirement_id": "ERR-003",
      "status": "FAIL",
      "docref": "https://docref.digital.govt.nz/...",
      "message": "Missing 'correlationId' in error responses",
      "remediation": "Add correlationId field to error schemas"
    }
  ]
}
```
**Compliance Evidence Produced**:
- Validation reports per API version
- Historical compliance tracking
- Remediation logs
- CI/CD integration records
---
### 4. Security Requirements Decision Matrix
**Purpose**: Guide security implementation choices with full traceability.
**Structure**:
| Decision Point | Options | When to Use | Requirements Satisfied | DocRef |
|---------------|---------|-------------|----------------------|--------|
| Authentication | API Key | Internal, low-risk | SEC-101, SEC-102 | [DocRef](...) |
| Authentication | OAuth 2.0 Client Credentials | Server-to-server | SEC-103, SEC-104 | [DocRef](...) |
| Authentication | OAuth 2.0 Authorization Code + PKCE | User delegation | SEC-105, SEC-106 | [DocRef](...) |
| TLS Version | TLS 1.2+ | All APIs | SEC-201 | [DocRef](...) |
| Token Lifetime | Short-lived + refresh | User sessions | SEC-301 | [DocRef](...) |
**Compliance Evidence Produced**:
- Security architecture decision records
- Requirement coverage by security choices
- Justification log for pattern selection
---
### 5. API Lifecycle Compliance Tracker
**Purpose**: Track compliance status as APIs progress through lifecycle phases.
**Phase Gates**:
1. **Design Gate**: Specification complete, consumer co-design documented
2. **Development Gate**: Implementation matches spec, tests written
3. **Security Gate**: Security review complete, vulnerability scan passed
4. **Deployment Gate**: Documentation published, catalogue entry created
5. **Operations Gate**: Monitoring configured, SLA defined
**Data Model**:
```json
{
  "api_id": "agency-customer-api",
  "current_phase": "deployment",
  "gates": {
    "design": {
      "status": "passed",
      "date": "2025-10-15",
      "evidence": ["spec-review.json", "codesign-notes.md"],
      "approver": "jane.smith@agency.govt.nz"
    },
    "security": {
      "status": "passed",
      "date": "2025-11-20",
      "evidence": ["security-review-report.pdf", "pentest-results.json"],
      "requirements_satisfied": ["SEC-001", "SEC-002", "..."]
    }
  }
}
```
**Compliance Evidence Produced**:
- Gate passage records with timestamps
- Evidence attachment logs
- Approval chains
- Full audit trail per API
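The gate ordering above implies a simple invariant: an API may enter a phase only once every earlier gate has passed. A sketch against the data model above (phase names as in the list; gate records as in the JSON):

```python
GATE_ORDER = ["design", "development", "security", "deployment", "operations"]

def can_enter(phase, gates):
    """True only if every gate earlier than `phase` has status 'passed'."""
    earlier = GATE_ORDER[:GATE_ORDER.index(phase)]
    return all(gates.get(g, {}).get("status") == "passed" for g in earlier)

gates = {
    "design": {"status": "passed"},
    "development": {"status": "passed"},
    "security": {"status": "passed"},
}
print(can_enter("deployment", gates))  # True
print(can_enter("operations", gates))  # False: deployment gate not yet passed
```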
---
### 6. Test Specification Library
**Purpose**: Pre-built test specifications that map directly to requirements.
**Categories**:
- **Contract Tests**: Validate API matches OpenAPI specification
- **Security Tests**: Verify authentication, authorization, TLS
- **Error Handling Tests**: Confirm error responses meet format requirements
- **Performance Tests**: Validate against SLA requirements
- **Versioning Tests**: Ensure version headers/URLs work correctly
**Example Test Specification**:
```yaml
test_id: SEC-AUTH-001
requirement_ref: SEC-103
docref: https://docref.digital.govt.nz/nz/dia-apis-partb/2022/en/#part2-section3
description: "API MUST reject requests without valid authentication"
test_steps:
  - action: "Send request without Authorization header"
    expected: "401 Unauthorized response"
  - action: "Send request with invalid token"
    expected: "401 Unauthorized with WWW-Authenticate header"
```
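Specs in this shape can be executed by a small driver. A sketch in which `send()` is a hypothetical transport stub (a real runner would issue HTTP requests) and the expected status code is taken from the leading token of each `expected` string:

```python
def run_spec(spec, send):
    """Execute each step and record a pass/fail result linked to the
    requirement the spec references."""
    results = []
    for step in spec["test_steps"]:
        actual = send(step["action"])
        expected = int(step["expected"].split()[0])  # leading status code
        results.append({"requirement_ref": spec["requirement_ref"],
                        "action": step["action"],
                        "pass": actual == expected})
    return results

spec = {
    "test_id": "SEC-AUTH-001",
    "requirement_ref": "SEC-103",
    "test_steps": [
        {"action": "Send request without Authorization header",
         "expected": "401 Unauthorized response"},
        {"action": "Send request with invalid token",
         "expected": "401 Unauthorized with WWW-Authenticate header"},
    ],
}

stub_send = lambda action: 401  # stand-in: pretend the API always returns 401
print(all(r["pass"] for r in run_spec(spec, stub_send)))  # True
```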
**Compliance Evidence Produced**:
- Test execution results linked to requirements
- Coverage reports (requirements → tests → results)
- Regression test history
- Automated compliance certification
---
### 7. Documentation Templates
**Purpose**: Structured templates ensuring all required documentation elements are included.
**Templates**:
- **API Overview Documentation** (required elements from Part C guidance)
- **Getting Started Guide** (onboarding steps)
- **Authentication Guide** (security implementation details)
- **Error Handling Reference** (standard error codes and formats)
- **Changelog Template** (versioning and deprecation notices)
**Template Structure** (example excerpt):
```markdown
# {API_NAME} - API Documentation
<!-- COMPLIANCE: DOC-001 - API documentation MUST be published -->
<!-- DocRef: https://docref.digital.govt.nz/nz/dia-apis-partc/2022/en/#part1-section2-para61 -->
## Overview
[Description of what the API does]
## Authentication
<!-- COMPLIANCE: DOC-002 - Authentication methods MUST be documented -->
<!-- DocRef: https://docref.digital.govt.nz/... -->
[Authentication details]
## Endpoints
<!-- COMPLIANCE: DOC-003 - All endpoints MUST be documented with examples -->
[Endpoint documentation]
```
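Because the markers are machine-readable HTML comments, completeness can be checked mechanically. A sketch that extracts marker IDs from a rendered document and reports gaps (the expected-ID set is illustrative):

```python
import re

# Matches markers of the form: <!-- COMPLIANCE: DOC-001 - description -->
MARKER = re.compile(r"<!-- COMPLIANCE: (\S+) - (.+?) -->")

template = """# Example API - API Documentation
<!-- COMPLIANCE: DOC-001 - API documentation MUST be published -->
## Authentication
<!-- COMPLIANCE: DOC-002 - Authentication methods MUST be documented -->
"""

found = {marker_id for marker_id, _ in MARKER.findall(template)}
missing = {"DOC-001", "DOC-002", "DOC-003"} - found
print(sorted(missing))  # ['DOC-003']
```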
**Compliance Evidence Produced**:
- Documentation completeness checklists
- Template compliance markers in source
- Publication records
- Version history
---
### 8. Design Decision Log Framework
**Purpose**: Structured capture of design decisions, especially deviations from SHOULD requirements.
**Decision Record Format**:
```json
{
  "decision_id": "DDR-2025-042",
  "api_id": "customer-api-v2",
  "date": "2025-11-15",
  "decision_type": "deviation",
  "requirement_id": "REST-012",
  "requirement_text": "APIs SHOULD use HATEOAS links",
  "docref": "https://docref.digital.govt.nz/...",
  "decision": "Not implementing HATEOAS for this API",
  "rationale": "Consumer feedback indicates mobile clients prefer simpler responses; HATEOAS would increase payload size by 40%",
  "alternatives_considered": ["Full HATEOAS", "Partial links only"],
  "risk_assessment": "Low - consumers have documentation access",
  "approver": "tech-lead@agency.govt.nz",
  "review_date": "2026-11-15"
}
```
**Compliance Evidence Produced**:
- Audit trail of all deviations
- Rationale documentation for assessors
- Periodic review triggers
- Decision search/filter capability
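The `review_date` field makes the periodic review trigger a one-line query. A sketch over illustrative records in the format above:

```python
from datetime import date

# Illustrative decision records in the format shown above.
decisions = [
    {"decision_id": "DDR-2025-042", "decision_type": "deviation",
     "review_date": "2026-11-15"},
    {"decision_id": "DDR-2025-007", "decision_type": "deviation",
     "review_date": "2025-06-01"},
]

def due_for_review(records, today):
    """Periodic review trigger: deviations whose review date has passed."""
    return [r["decision_id"] for r in records
            if r["decision_type"] == "deviation"
            and date.fromisoformat(r["review_date"]) <= today]

print(due_for_review(decisions, date(2025, 12, 10)))  # ['DDR-2025-007']
```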
---
### 9. API Gateway Policy Templates
**Purpose**: Pre-configured gateway policies that enforce mandatory requirements.
**Policy Categories**:
- **Authentication Enforcement** (reject unauthenticated requests)
- **Rate Limiting** (protect backend systems)
- **TLS Enforcement** (minimum TLS 1.2)
- **Header Validation** (required headers present)
- **Request Size Limits** (prevent DoS)
- **CORS Configuration** (for browser-based consumers)
- **Logging Configuration** (audit trail requirements)
**Example Policy (pseudocode)**:
```yaml
policy_id: GATEWAY-SEC-001
requirement_refs: [SEC-101, SEC-102, SEC-103]
docrefs:
  - https://docref.digital.govt.nz/nz/dia-apis-partb/2022/en/#part2-section1
rules:
  - name: enforce_tls
    condition: request.protocol != "HTTPS"
    action: reject(426, "Upgrade Required")
  - name: require_auth
    condition: request.headers["Authorization"] == null
    action: reject(401, "Unauthorized")
exceptions:
  - path: /health
  - path: /public/*
```
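The pseudocode policy above evaluates as exception check first, then rules in order. A sketch with the conditions recast as predicates (rule names and rejection codes match the example; the request shape is an assumption):

```python
import fnmatch

# Equivalent logic to the pseudocode policy, with conditions as predicates
# that return True when the request is acceptable.
POLICY = {
    "exceptions": ["/health", "/public/*"],
    "rules": [
        ("enforce_tls", lambda req: req["protocol"] == "HTTPS",
         (426, "Upgrade Required")),
        ("require_auth", lambda req: "Authorization" in req["headers"],
         (401, "Unauthorized")),
    ],
}

def evaluate(policy, request):
    """Return None to allow the request, or (status, reason) to reject it."""
    if any(fnmatch.fnmatch(request["path"], p) for p in policy["exceptions"]):
        return None
    for _name, check, rejection in policy["rules"]:
        if not check(request):
            return rejection
    return None

req = {"protocol": "HTTPS", "headers": {}, "path": "/customers"}
print(evaluate(POLICY, req))                         # (401, 'Unauthorized')
print(evaluate(POLICY, {**req, "path": "/health"}))  # None (exception path)
```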
**Compliance Evidence Produced**:
- Gateway enforcement logs
- Policy deployment records
- Request rejection statistics
- Continuous compliance monitoring
---
### 10. Compliance Dashboard Schema
**Purpose**: Data model for aggregating compliance status across multiple APIs.
**Schema**:
```json
{
  "organisation_id": "agency-xyz",
  "reporting_period": "2025-Q4",
  "apis": [
    {
      "api_id": "customer-api",
      "compliance_summary": {
        "must_requirements": {"total": 45, "satisfied": 45, "percentage": 100},
        "should_requirements": {"total": 32, "satisfied": 28, "deviations": 4},
        "overall_score": 96
      },
      "lifecycle_status": "production",
      "last_assessment": "2025-12-01",
      "open_issues": 2,
      "evidence_links": ["..."]
    }
  ],
  "aggregated_metrics": {
    "total_apis": 12,
    "average_compliance": 94.2,
    "common_gaps": ["Error correlation IDs", "HATEOAS implementation"]
  }
}
```
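The `aggregated_metrics` block is derivable from the per-API summaries, so it never needs to be maintained by hand. A sketch with illustrative inputs (the second API is hypothetical):

```python
def aggregate(apis):
    """Derive the aggregated_metrics block from per-API summaries."""
    scores = [a["compliance_summary"]["overall_score"] for a in apis]
    return {
        "total_apis": len(apis),
        "average_compliance": round(sum(scores) / len(scores), 1),
    }

apis = [
    {"api_id": "customer-api", "compliance_summary": {"overall_score": 96}},
    {"api_id": "billing-api", "compliance_summary": {"overall_score": 88}},
]
print(aggregate(apis))  # {'total_apis': 2, 'average_compliance': 92.0}
```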
**Compliance Evidence Produced**:
- Organisation-wide compliance reports
- Trend analysis over time
- Comparative benchmarking
- Audit-ready summaries
---
## Implementation Priority
| Priority | Asset | Effort | Impact | Evidence Quality |
|----------|-------|--------|--------|-----------------|
| 1st | API Requirements Registry | Medium | High | Foundation for all other assets |
| 2nd | OpenAPI Spec Validator | Medium | High | Automated, continuous |
| 3rd | Compliance Checklist Generator | Low | High | Direct traceability |
| 4th | Test Specification Library | Medium | High | Automated testing = evidence |
| 5th | Security Decision Matrix | Low | Medium | Decision support + record |
| 6th | Documentation Templates | Low | Medium | Embedded compliance markers |
| 7th | API Lifecycle Tracker | Medium | Medium | Process evidence |
| 8th | Design Decision Log | Low | Medium | Deviation justification |
| 9th | Gateway Policy Templates | High | High | Continuous enforcement |
| 10th | Compliance Dashboard | Medium | Medium | Aggregated reporting |
---
## Key Principle: Evidence Through Use
All proposed assets follow the principle that **compliance evidence is produced as a byproduct of normal usage**:
| Activity | Evidence Produced |
|----------|------------------|
| Validator runs | Validation reports saved automatically |
| Checklists completed | Completion records with timestamps |
| Tests executed | Results mapped to requirements |
| Documentation written | Embedded compliance markers tracked |
| APIs deployed | Gateway logs capture enforcement |
| Decisions made | Decision records created in workflow |
This means teams do not need to produce separate compliance documentation: the artifacts they create while building APIs *are* the compliance evidence.
---
## Next Steps
1. **Build the API Requirements Registry** - Extract all MUST/SHOULD/MAY requirements from source documents with DocRef citations
2. **Develop validation rules** - Map requirements to automated checks
3. **Create initial templates** - Documentation and decision log templates
4. **Prototype the validator** - OpenAPI spec validation against requirements
---
## Source Documents
All compliance assets trace back to these authoritative sources:
- [Draft API Standard](https://docref.digital.govt.nz/nz/dia/nz-api-standard/draft/en/)
- [Part A: API Concepts and Management](https://docref.digital.govt.nz/nz/dia-apis-parta/2022/en/)
- [Part B: API Security](https://docref.digital.govt.nz/nz/dia-apis-partb/2022/en/)
- [Part C: API Development](https://docref.digital.govt.nz/nz/dia-apis-partc/2022/en/)