feat: switch to stdlib iterators #1144
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Draft: alecthomas wants to merge 2 commits into master from aat/stdlib-iterator
@@ -1,6 +1,6 @@
 module github.com/alecthomas/chroma/v2/cmd/chroma

-go 1.22
+go 1.23

 toolchain go1.25.1
@@ -1,6 +1,6 @@
 module github.com/alecthomas/chroma/v2/cmd/chromad

-go 1.22
+go 1.23

 toolchain go1.25.1
@@ -1,6 +1,6 @@
 module github.com/alecthomas/chroma/v2

-go 1.22
+go 1.23

 require (
 	github.com/alecthomas/assert/v2 v2.11.0
@@ -1,57 +1,53 @@
 package chroma

-import "strings"
+import (
+	"iter"
+	"strings"
+)

 // An Iterator across tokens.
 //
 // EOF will be returned at the end of the Token stream.
 //
 // If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover.
-type Iterator func() Token
+type Iterator iter.Seq[Token]

 // Tokens consumes all tokens from the iterator and returns them as a slice.
 func (i Iterator) Tokens() []Token {
 	var out []Token
-	for t := i(); t != EOF; t = i() {
+	for t := range i {
+		if t == EOF {
+			break
+		}
 		out = append(out, t)
 	}
 	return out
 }

-// Stdlib converts a Chroma iterator to a Go 1.23-compatible iterator.
-func (i Iterator) Stdlib() func(yield func(Token) bool) {
-	return func(yield func(Token) bool) {
-		for t := i(); t != EOF; t = i() {
-			if !yield(t) {
-				return
-			}
-		}
-	}
-}
-
 // Concaterator concatenates tokens from a series of iterators.
 func Concaterator(iterators ...Iterator) Iterator {
-	return func() Token {
-		for len(iterators) > 0 {
-			t := iterators[0]()
-			if t != EOF {
-				return t
-			}
-			iterators = iterators[1:]
-		}
-		return EOF
-	}
+	return func(yield func(Token) bool) {
+		for _, it := range iterators {
+			for t := range it {
+				if t == EOF {
+					break
+				}
+				if !yield(t) {
+					return
+				}
+			}
+		}
+	}
 }

 // Literator converts a sequence of literal Tokens into an Iterator.
 func Literator(tokens ...Token) Iterator {
-	return func() Token {
-		if len(tokens) == 0 {
-			return EOF
-		}
-		token := tokens[0]
-		tokens = tokens[1:]
-		return token
-	}
+	return func(yield func(Token) bool) {
+		for _, token := range tokens {
+			if !yield(token) {
+				return
+			}
+		}
+	}
 }
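For orientation, and not part of the PR diff itself: a sketch of how a caller might consume the new iter.Seq-based Iterator, using the existing lexers.Get and Tokenise API. The EOF check reflects this draft, which still emits the sentinel.

package main

import (
	"fmt"

	"github.com/alecthomas/chroma/v2"
	"github.com/alecthomas/chroma/v2/lexers"
)

func main() {
	lexer := lexers.Get("go")
	it, err := lexer.Tokenise(nil, "package main\n")
	if err != nil {
		panic(err)
	}
	// Range directly over the iterator instead of calling it repeatedly until EOF.
	for token := range it {
		if token == chroma.EOF {
			break
		}
		fmt.Printf("%s %q\n", token.Type, token.Value)
	}
}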
@@ -69,63 +69,71 @@ func httpBodyContentTypeLexer(lexer Lexer) Lexer { return &httpBodyContentTyper{
 type httpBodyContentTyper struct{ Lexer }

 func (d *httpBodyContentTyper) Tokenise(options *TokeniseOptions, text string) (Iterator, error) { // nolint: gocognit
-	var contentType string
-	var isContentType bool
-	var subIterator Iterator
-
 	it, err := d.Lexer.Tokenise(options, text)
 	if err != nil {
 		return nil, err
 	}

-	return func() Token {
-		token := it()
+	return func(yield func(Token) bool) {
+		var contentType string
+		var isContentType bool
+		var subIterator Iterator

-		if token == EOF {
-			if subIterator != nil {
-				return subIterator()
-			}
-			return EOF
-		}
-
-		switch {
-		case token.Type == Name && strings.ToLower(token.Value) == "content-type":
-			{
-				isContentType = true
-			}
-		case token.Type == Literal && isContentType:
-			{
-				isContentType = false
-				contentType = strings.TrimSpace(token.Value)
-				pos := strings.Index(contentType, ";")
-				if pos > 0 {
-					contentType = strings.TrimSpace(contentType[:pos])
-				}
-			}
-		case token.Type == Generic && contentType != "":
-			{
-				lexer := MatchMimeType(contentType)
-
-				// application/calendar+xml can be treated as application/xml
-				// if there's not a better match.
-				if lexer == nil && strings.Contains(contentType, "+") {
-					slashPos := strings.Index(contentType, "/")
-					plusPos := strings.LastIndex(contentType, "+")
-					contentType = contentType[:slashPos+1] + contentType[plusPos+1:]
-					lexer = MatchMimeType(contentType)
-				}
-
-				if lexer == nil {
-					token.Type = Text
-				} else {
-					subIterator, err = lexer.Tokenise(nil, token.Value)
-					if err != nil {
-						panic(err)
-					}
-					return EOF
-				}
-			}
-		}
-		return token
+		for token := range it {
+			if token == EOF {
+				break
+			}
+
+			switch {
+			case token.Type == Name && strings.ToLower(token.Value) == "content-type":
+				{
+					isContentType = true
+				}
+			case token.Type == Literal && isContentType:
+				{
+					isContentType = false
+					contentType = strings.TrimSpace(token.Value)
+					pos := strings.Index(contentType, ";")
+					if pos > 0 {
+						contentType = strings.TrimSpace(contentType[:pos])
+					}
+				}
+			case token.Type == Generic && contentType != "":
+				{
+					lexer := MatchMimeType(contentType)
+
+					// application/calendar+xml can be treated as application/xml
+					// if there's not a better match.
+					if lexer == nil && strings.Contains(contentType, "+") {
+						slashPos := strings.Index(contentType, "/")
+						plusPos := strings.LastIndex(contentType, "+")
+						contentType = contentType[:slashPos+1] + contentType[plusPos+1:]
+						lexer = MatchMimeType(contentType)
+					}
+
+					if lexer == nil {
+						token.Type = Text
+					} else {
+						subIterator, err = lexer.Tokenise(nil, token.Value)
+						if err != nil {
+							panic(err)
+						}
+						// Emit tokens from the sub-iterator
+						for st := range subIterator {
+							if st == EOF {
+								break
+							}
+							if !yield(st) {
+								return
+							}
+						}
+						continue
+					}
+				}
+			}
+			if !yield(token) {
+				return
+			}
+		}
 	}, nil
 }

Review comment on the new if !yield(token) check: Given this is at the end of the function, I think this test is redundant.
FWIW I don't see that this would be necessary: in fact I'd strongly suggest it's not necessary to define an EOF token at all. Then this function can become much simpler (see the sketch below), which to my mind somewhat calls into question whether the Tokens method is worth having at all. Maybe Iterator could even be defined just as an alias (also sketched below). But then again, it's quite possibly worth preserving surface API compatibility even if the underlying representation changes.
This thought had crossed my mind and I don't exactly recall why there's an EOF token TBH, but there was a good reason for it at some point, so I left it in.
FWIW the EOF token definitely seems a bit "surprising" to me. An analogy that springs to mind is that it feels a little like passing around length-delimited strings but still keeping a zero-byte at the end and asking everyone to ignore the last character.
I've not seen an example of this pattern before and I personally would think very carefully through (and explicitly document) the reasons for why it needs to be this way, assuming it really does.
It existed before to signify that the stream had reached EOF, which is obviously redundant with Go iterators.
FWIW I was aware of its previous use/need, but wondered if there was a reason you'd left it around when moving to the new iterator API.
Ah, I see. No, this is basically a half-hour PoC, not ready for merge. I was mostly pondering whether switching to iterators would be worth a major version bump, but in combination with some other cleanup, like eradicating EOF, I think it could be worthwhile.