4 changes: 3 additions & 1 deletion .cspell.json
@@ -26,7 +26,9 @@
"pagetitle",
"dcterms",
"Blockquotes",
"tracematrix"
"tracematrix",
"testname",
"filepart"
],
"ignorePaths": [
"node_modules",
22 changes: 16 additions & 6 deletions AGENTS.md
@@ -121,8 +121,12 @@ ReqStream/
- Other markdown files use link references: `[text][ref]` with `[ref]: url` at end
- **Linting**:
- **Markdown**: Must pass markdownlint (max line length: 120 chars)
- Lists must be surrounded by blank lines (MD032)
- Run locally: Check CI workflow for markdownlint-cli2-action usage
- **Spell Check**: Must pass cspell (custom dictionary in `.cspell.json`)
- Add project-specific terms to the custom dictionary if needed
- **YAML**: Must pass yamllint (2-space indentation, max line length: 120 chars)
- **All linting must pass locally before committing** - CI will reject changes with linting errors

## CI/CD Pipelines

@@ -143,12 +147,18 @@ dotnet pack --no-build --configuration Release

## Pre-Finalization Quality Checks

Before completing any task, perform these checks in order:

1. **Build and Test**: Run `dotnet build --configuration Release && dotnet test --configuration Release`
2. **Code Review**: Use `code_review` tool and address valid concerns
3. **Security Scanning**: Use `codeql_checker` tool after code review
4. **Linting**: Ensure markdown, spell check, and YAML linting pass (runs in CI)
Before completing any task, you **MUST** perform these checks in order and ensure they all pass:

1. **Build and Test**: Run `dotnet build --configuration Release && dotnet test --configuration Release` - all tests
must pass with zero warnings
2. **Code Review**: Use `code_review` tool and address all valid concerns
3. **Security Scanning**: Use `codeql_checker` tool after code review - must report zero vulnerabilities
4. **Linting**: **MANDATORY** - Run all linters locally and fix any issues before pushing changes:
- **Markdown**: Run markdownlint on all changed `.md` files - must pass with zero errors
- **Spell Check**: Run cspell on all changed files - must pass with zero errors
- **YAML**: Run yamllint on all changed `.yaml` or `.yml` files - must pass with zero errors
- These linters run in CI and will fail the build if not passing
- **DO NOT** rely solely on CI to catch linting issues - catch them locally first

## Project-Specific Guidelines

41 changes: 41 additions & 0 deletions README.md
@@ -157,9 +157,50 @@ mappings:
- **Child Requirements**: Requirements can reference other requirements as children using the `children` field
- **Test Mappings**: Tests can be mapped to requirements either inline (within the requirement definition) or
separately (using the `mappings` section)
- **Test Source Linking**: Support for source-specific test matching using the `[filepart@]testname` pattern,
allowing requirements to specify tests from specific result files (e.g., `windows-latest@MyTest`)
- **File Includes**: Use the `includes` section to reference other YAML files containing additional requirements
or test mappings

### Test Source Linking

When testing requirements across multiple platforms or configurations, test result files often include platform
identifiers in their names (e.g., `test-results-windows-latest.trx`, `test-results-ubuntu-latest.junit.xml`).
Test source linking allows requirements to specify which test results should come from which source files.

**Pattern**: `[filepart@]testname`

- `filepart` (optional): A substring that matches the base filename (without extension) of the test result file.
Matching is case-insensitive and supports partial matches.
- `testname`: The exact name of the test as it appears in the test result file.

**Examples**:

```yaml
requirements:
- id: "PLAT-001"
title: "Shall support Windows"
tests:
- "windows-latest@Test_PlatformFeature" # Matches only from files containing "windows-latest"

- id: "PLAT-002"
title: "Shall support Linux"
tests:
- "ubuntu-latest@Test_PlatformFeature" # Matches only from files containing "ubuntu-latest"

- id: "PLAT-003"
title: "Shall support cross-platform features"
tests:
- "Test_CrossPlatformFeature" # Aggregates from all test result files
```

**Key behaviors**:

- Tests with source specifiers only match results from files containing the specified `filepart`
- Tests without source specifiers aggregate results from all test result files
- File part matching is case-insensitive and supports partial filename matching
- Both plain and source-specific test names can be mixed in the same requirement
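
The behaviors above can be sketched as a small Python model (a behavioral illustration of the matching rules, not the tool's actual C# implementation; the function names are ours):

```python
def parse_test_name(name):
    """Split '[filepart@]testname' into (filepart, testname); filepart is None if absent."""
    at = name.find("@")
    if 0 < at < len(name) - 1:
        return name[:at], name[at + 1:]
    return None, name


def find_matching_test(required_tests, actual_test, file_base_name):
    """Return the requirement test entry matching a result, or None.

    Source-specific entries are checked first, then plain entries.
    """
    # First pass: source-specific entries whose filepart appears in the filename
    for required in required_tests:
        file_part, test_name = parse_test_name(required)
        if (file_part is not None
                and file_part.lower() in file_base_name.lower()
                and test_name == actual_test):
            return required
    # Second pass: plain entries match results from any file
    for required in required_tests:
        file_part, test_name = parse_test_name(required)
        if file_part is None and test_name == actual_test:
            return required
    return None
```

For example, `find_matching_test({"windows-latest@Test_X"}, "Test_X", "test-results-WINDOWS-latest")` matches the source-specific entry, because filepart matching is case-insensitive and only requires a substring match.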

## Development

### Requirements
74 changes: 74 additions & 0 deletions docs/guide/guide.md
@@ -237,6 +237,54 @@ mappings:
The separate `mappings` section is useful when test mappings are maintained by a different team or in a different
file from the requirements.

### Test Source Linking

When testing requirements across multiple platforms or configurations, use test source linking to distinguish tests
from different sources. This is particularly useful for matrix testing scenarios.

**Pattern**: `[filepart@]testname`

- `filepart` (optional): A substring matching the base filename (without extension) of the test result file
- `testname`: The exact test name from the test result file

**Example - Platform-specific testing:**

```yaml
requirements:
- id: "PLAT-001"
title: "Shall support Windows operating systems"
tests:
- "windows-latest@Test_PlatformBasic"
- "windows-latest@Test_FileSystem"

- id: "PLAT-002"
title: "Shall support Linux operating systems"
tests:
- "ubuntu-latest@Test_PlatformBasic"
- "ubuntu-latest@Test_FileSystem"

- id: "PLAT-003"
title: "Shall support cross-platform APIs"
tests:
- "Test_CrossPlatformAPI" # Aggregates from all platforms
```

With test result files:

- `test-results-windows-latest.trx`
- `test-results-ubuntu-latest.trx`
- `test-results-macos-latest.trx`

The `windows-latest@Test_PlatformBasic` test will only match results from files containing "windows-latest" in their
base filename. The `Test_CrossPlatformAPI` test without a source specifier will aggregate results from all three
files.

**Key features:**

- Case-insensitive matching: `windows@Test` matches `test-results-WINDOWS-latest.trx`
- Partial matching: `ubuntu@Test` matches `test-results-ubuntu-22.04-latest.trx`
- Plain test names: Tests without `filepart@` prefix aggregate results from all test result files

### File Includes

Large projects can be split across multiple YAML files using the `includes` section:
@@ -662,6 +710,32 @@ trace matrix.
A: No, test names must be specified exactly. However, you can use patterns when specifying test result files
(`--tests "**/*.trx"`).

**Q: How can I link tests to specific test result files?**

A: Use the test source linking feature with the `[filepart@]testname` pattern. The `filepart` is a substring that
matches the base filename (without extension) of the test result file. For example:

```yaml
requirements:
- id: "WIN-001"
title: "Shall support Windows"
tests:
- "windows-latest@Test_PlatformFeature" # Matches only from files containing "windows-latest"

- id: "LIN-001"
title: "Shall support Linux"
tests:
- "ubuntu-latest@Test_PlatformFeature" # Matches only from files containing "ubuntu-latest"
```

File part matching is case-insensitive and supports partial matches. Tests without the `filepart@` prefix aggregate
results from all test result files.

**Q: Can I mix plain and source-specific test names?**

A: Yes, you can mix both styles in the same requirement. Plain test names will aggregate results from all test result
files, while source-specific test names will only match their specified sources.
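
For instance, one requirement could combine both styles (the IDs and test names here are illustrative):

```yaml
requirements:
  - id: "MIX-001"
    title: "Shall behave consistently on all platforms"
    tests:
      - "Test_CoreBehavior"                # aggregated from every result file
      - "windows-latest@Test_WinSpecific"  # only from windows-latest results
```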

### Export Questions

**Q: Can I customize the markdown format of reports?**
26 changes: 26 additions & 0 deletions requirements.yaml
@@ -120,6 +120,32 @@ sections:
- "Create_WithTestsPattern_ExpandsGlobPattern"
- "TraceMatrix_WithMixedFormats_ProcessesBoth"

- id: "TEST-005"
title: "The tool shall support source-specific test matching using filepart@testname pattern."
tests:
- "TraceMatrix_WithSourceSpecificTests_MatchesCorrectly"
- "TraceMatrix_WithSourceSpecificTests_DoesNotMatchOtherSources"

- id: "TEST-006"
title: "The tool shall perform case-insensitive matching of file parts in source-specific test names."
tests:
- "TraceMatrix_WithSourceSpecificTests_IsCaseInsensitive"

- id: "TEST-007"
title: "The tool shall support partial filename matching for source-specific tests."
tests:
- "TraceMatrix_WithSourceSpecificTests_MatchesPartialFilename"

- id: "TEST-008"
title: "The tool shall support plain test names that aggregate from all sources."
tests:
- "TraceMatrix_WithPlainTestNames_MatchesAllSources"

- id: "TEST-009"
title: "The tool shall support mixing plain and source-specific test names in the same requirement."
tests:
- "TraceMatrix_WithMixedTestNames_MatchesAppropriately"

- title: "Reporting"
requirements:
- id: "RPT-001"
73 changes: 68 additions & 5 deletions src/DemaConsulting.ReqStream/TraceMatrix.cs
@@ -443,6 +443,9 @@ private void ProcessTestResultFile(string filePath, HashSet<string> requiredTest
throw new FileNotFoundException($"Test result file not found: {filePath}", filePath);
}

// Extract the base filename (without extension) for source matching
var fileBaseName = Path.GetFileNameWithoutExtension(filePath);

// Read the file content
var content = File.ReadAllText(filePath);

@@ -468,17 +471,18 @@
// Process each test result
foreach (var result in testResults.Results)
{
// Only process tests that are referenced in requirements
if (!requiredTests.Contains(result.Name))
// Check if any required test matches this result
var matchingTestName = FindMatchingTestName(requiredTests, result.Name, fileBaseName);
if (matchingTestName == null)
{
continue;
}

// Get or create the test result entry
if (!_testResults.TryGetValue(result.Name, out var entry))
// Get or create the test result entry using the full test name from requirements
if (!_testResults.TryGetValue(matchingTestName, out var entry))
{
entry = new TestResultEntry();
_testResults[result.Name] = entry;
_testResults[matchingTestName] = entry;
}

// Update execution counts
@@ -490,4 +494,63 @@
}
}
}

/// <summary>
/// Finds a matching test name from the required tests set.
/// Supports both plain test names and source-specific test names with the pattern: [filepart@]testname.
/// </summary>
/// <param name="requiredTests">Set of test names from requirements.</param>
/// <param name="actualTestName">The actual test name from the test result file.</param>
/// <param name="fileBaseName">The base name of the test result file (without extension).</param>
/// <returns>The matching test name from requirements, or null if no match found.</returns>
private static string? FindMatchingTestName(HashSet<string> requiredTests, string actualTestName, string fileBaseName)
{
// First, try to find an exact match with source specifier: <filepart>@<testname>
// Check if any filepart from the base name matches
foreach (var requiredTest in requiredTests)
{
var (filePart, testName) = ParseTestName(requiredTest);

// If there's a file part, check if it matches and the test name matches
if (filePart != null &&
fileBaseName.Contains(filePart, StringComparison.OrdinalIgnoreCase) &&
testName == actualTestName)
{
return requiredTest;
}
}

// Second, try to find a plain test name match (no source specifier)
foreach (var requiredTest in requiredTests)
{
var (filePart, testName) = ParseTestName(requiredTest);

// If there's no file part and the test name matches
if (filePart == null && testName == actualTestName)
{
return requiredTest;
}
}

return null;
}

/// <summary>
/// Parses a test name to extract the optional file part and the actual test name.
/// Format: [filepart@]testname
/// </summary>
/// <param name="testName">The test name from requirements.</param>
/// <returns>A tuple of (filePart, testName). filePart is null if not specified.</returns>
private static (string? filePart, string testName) ParseTestName(string testName)
{
var atIndex = testName.IndexOf('@');
if (atIndex > 0 && atIndex < testName.Length - 1)
{
var filePart = testName[..atIndex];
var actualTestName = testName[(atIndex + 1)..];
return (filePart, actualTestName);
}

return (null, testName);
}
}