
feat(rules): implement pattern string lexer for tokenization #4

Merged
fohte merged 3 commits into main from fohte/impl-runok-init-task-2-1
Feb 8, 2026

Conversation

@fohte (Owner) commented Feb 8, 2026

Why

  • The pattern parser requires a lexer as its first stage to split pattern strings (e.g. "curl -X|--request POST *") into raw tokens
    • The parser cannot recognize syntactic elements (alternation, wildcards, negation) in rule patterns without a lexer

What

  • Add a pattern string lexer (pattern_lexer) to the rules module
    • Recognize alternation (-X|--request), wildcards (*), negation (!value), optional groups ([...]), placeholders (<cmd>, <path:name>), and quoted strings
    • Return errors for malformed syntax (unclosed brackets, nested square brackets, empty alternation parts)
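The token kinds listed above can be sketched roughly as follows. This is a minimal illustration of the classification step, assuming hypothetical `LexToken` variants and a `classify_word` helper; the actual enum in src/rules/pattern_lexer.rs may differ, and quoting, optional groups, and placeholders are omitted for brevity.

```rust
// Hypothetical token shapes; the PR's real LexToken enum may differ.
#[derive(Debug, PartialEq)]
enum LexToken {
    Literal(String),
    Alternation(Vec<String>),
    Wildcard,
    Negation(Vec<String>),
}

// Classify one whitespace-separated word (no quoting/bracket handling here).
fn classify_word(word: &str) -> LexToken {
    if word == "*" {
        LexToken::Wildcard
    } else if let Some(rest) = word.strip_prefix('!') {
        // !value or !a|b|c
        LexToken::Negation(rest.split('|').map(String::from).collect())
    } else if word.contains('|') {
        // -X|--request, POST|PUT|PATCH
        LexToken::Alternation(word.split('|').map(String::from).collect())
    } else {
        LexToken::Literal(word.to_string())
    }
}

fn main() {
    let tokens: Vec<LexToken> = "curl -X|--request POST *"
        .split_whitespace()
        .map(classify_word)
        .collect();
    println!("{:?}", tokens);
}
```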


fohte and others added 2 commits February 8, 2026 03:14
The pattern parser needs a lexer that splits pattern strings like
"curl -X|--request POST *" into raw tokens as its first stage.
Without this, the parser cannot recognize the different syntactic
elements (alternation, wildcards, negation, etc.) in rule patterns.

The lexer recognizes:
- Pipe-separated alternation (-X|--request, POST|PUT|PATCH)
- Wildcard (*) and negation (!value, !a|b|c)
- Optional group brackets ([...]) with nesting validation
- Angle-bracket placeholders (<cmd>, <path:name>)
- Quoted strings (single and double) with proper whitespace handling
- Error reporting for unclosed brackets, quotes, and invalid syntax

Co-Authored-By: Claude Opus 4.6 <[email protected]>
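The angle-bracket placeholder forms mentioned in the commit (`<cmd>`, `<path:name>`) could be parsed along these lines. The `Placeholder` struct and `parse_placeholder` name are assumptions for illustration, not the PR's actual API:

```rust
// Hypothetical placeholder representation; the real token layout may differ.
#[derive(Debug, PartialEq)]
struct Placeholder {
    kind: String,         // e.g. "cmd" or "path"
    name: Option<String>, // e.g. Some("name") for <path:name>
}

// Parse the text between '<' and '>' into a Placeholder, splitting on the
// first ':' to separate the kind from an optional name.
fn parse_placeholder(inner: &str) -> Placeholder {
    match inner.split_once(':') {
        Some((kind, name)) => Placeholder {
            kind: kind.to_string(),
            name: Some(name.to_string()),
        },
        None => Placeholder {
            kind: inner.to_string(),
            name: None,
        },
    }
}

fn main() {
    println!("{:?}", parse_placeholder("path:name"));
    println!("{:?}", parse_placeholder("cmd"));
}
```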
The lexer accepted patterns like "a||b", "-X|", "|--request" without
error, producing Alternation tokens with empty strings. This silently
ignored malformed patterns that would cause incorrect matching later.

Validate that all pipe-separated parts are non-empty and return
EmptyAlternation error for both regular and negated alternations.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
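The validation described in this commit can be sketched as below, including the `collect::<Result<...>>` short-circuit adopted later in the review. The error variant and function name mirror the commit message, but the exact signatures are assumptions:

```rust
// Hypothetical error type; the PR's real error enum has more variants
// (unclosed brackets, unclosed quotes, etc.).
#[derive(Debug, PartialEq)]
enum LexError {
    EmptyAlternation,
}

// Reject "a||b", "-X|", "|--request": every pipe-separated part must be
// non-empty. Collecting into Result short-circuits on the first empty part.
fn validate_alternation_parts(word: &str) -> Result<Vec<String>, LexError> {
    word.split('|')
        .map(|part| {
            if part.is_empty() {
                Err(LexError::EmptyAlternation)
            } else {
                Ok(part.to_string())
            }
        })
        .collect()
}

fn main() {
    assert!(validate_alternation_parts("a||b").is_err());
    assert_eq!(
        validate_alternation_parts("POST|PUT"),
        Ok(vec!["POST".to_string(), "PUT".to_string()])
    );
    println!("ok");
}
```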
@gemini-code-assist

Summary of Changes

Hello @fohte, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces the foundational pattern_lexer module, a critical first step for the upcoming pattern parser in the rules component. The lexer is responsible for breaking down complex pattern strings into a stream of recognizable tokens, enabling the parser to understand syntactic elements such as alternations, wildcards, negations, and optional groups. This ensures that rule patterns can be accurately interpreted and processed.

Highlights

  • New pattern_lexer module: Introduced a new pattern_lexer module to the rules component, which is the first stage for parsing pattern strings.
  • LexToken Enum Definition: Defined the LexToken enum to represent various syntactic elements like literals, alternations, wildcards, negations, optional groups, and placeholders.
  • Tokenization Logic: Implemented the core tokenize function, which processes pattern strings and converts them into a sequence of LexTokens, handling different pattern constructs.
  • Robust Error Handling: Incorporated robust error handling for malformed patterns, including unclosed quotes, unclosed angle brackets, unclosed square brackets, nested square brackets, and empty alternation parts.
  • Comprehensive Unit Tests: Added comprehensive unit tests to validate the lexer's behavior across various valid and erroneous pattern string scenarios, ensuring correctness and reliability.
Changelog
  • src/rules/mod.rs
    • Exported the new pattern_lexer module to make it publicly accessible within the rules component.
  • src/rules/pattern_lexer.rs
    • Added LexToken enum for representing different token types found in pattern strings.
    • Implemented the tokenize function for converting raw pattern strings into a structured sequence of LexTokens.
    • Included helper functions such as consume_word, is_word_boundary, classify_word, classify_negation, and validate_alternation_parts to support the tokenization process.
    • Integrated comprehensive unit tests covering basic literals, alternations, wildcards, negations, square brackets, placeholders, quoted strings, complex patterns, whitespace handling, and various error conditions.
Activity
  • fohte created this pull request to implement the pattern string lexer, laying the groundwork for future pattern parsing capabilities.

@devin-ai-integration bot (Contributor) left a comment
✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.

View in Devin Review to see 3 additional findings.


@gemini-code-assist bot left a comment

Code Review

This pull request introduces a new lexer for rule patterns, which is a crucial first step for the pattern parser. The implementation is well-structured and comes with a comprehensive test suite covering various cases, including literals, alternations, wildcards, negations, and more.

I've added a couple of suggestions to improve the code further by reducing duplication and using more idiomatic Rust constructs. Overall, this is a solid contribution.

Address review feedback:
- Quoted string and angle bracket parsing shared identical
  consume-until-delimiter logic. Extract into `consume_until` helper.
- `validate_alternation_parts` now uses `collect::<Result<...>>` to
  short-circuit on the first empty part instead of collecting all parts
  before validation.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
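The extracted consume-until-delimiter helper might look like the following. The signature is an assumption (the PR's actual helper may carry position information for error reporting), with `Err(())` standing in for the lexer's unclosed-quote/unclosed-bracket errors:

```rust
use std::iter::Peekable;
use std::str::Chars;

// Hypothetical shared helper for quoted strings ("...", '...') and
// angle-bracket placeholders (<...>): consume characters up to `delim`.
// Returns Err(()) if the input ends before the closing delimiter, standing
// in for the lexer's unclosed-delimiter errors.
fn consume_until(chars: &mut Peekable<Chars<'_>>, delim: char) -> Result<String, ()> {
    let mut out = String::new();
    for c in chars.by_ref() {
        if c == delim {
            return Ok(out);
        }
        out.push(c);
    }
    Err(())
}

fn main() {
    // After the lexer sees '<', the same helper reads up to '>':
    let mut chars = "path:name> rest".chars().peekable();
    assert_eq!(consume_until(&mut chars, '>'), Ok("path:name".to_string()));

    // An unclosed quote surfaces as an error:
    let mut chars = "unclosed".chars().peekable();
    assert!(consume_until(&mut chars, '"').is_err());
    println!("ok");
}
```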
@fohte fohte merged commit 397e6d8 into main Feb 8, 2026
2 checks passed
@fohte fohte deleted the fohte/impl-runok-init-task-2-1 branch February 8, 2026 04:35
@fohte-bot fohte-bot bot mentioned this pull request Feb 8, 2026