Hacker News

(it’s actually also straightforward to write a recursive descent parser directly on the char stream w/o a lexing step.)


how? it's not like you can magically skip the actual tokenization. you're basically saying you can do lexical analysis and syntactic analysis in the same function. sure, but that makes the code that much hairier - there's a reason why they're typically factored into a lexer and a parser.


> you're basically saying you can do lexical analysis and syntactic analysis in the same function.

yes

> that makes the code that much hairier

true, but it does save you the rigmarole of a separate lexer.

    func parseDeclaration(s string, i0 Pos) (n Node, i Pos, err error) {
        i = i0
        i = skipSpace(s, i)
        if word(s, i, "var") {
            v := &ast.VarDecl{VarPos: i}
            i += len("var")
            i = skipSpace(s, i)
            if v.Ident, i, err = parseIdent(s, i); err != nil {
                return
            }
            i = skipSpace(s, i)
            if is(s, i, "=") {
            i++
                i = skipSpace(s, i)
                if v.Value, i, err = parseExpr(s, i); err != nil {
                    return
                }
            }
            if !is(s, i, ";") {
                return nil, i, fmt.Errorf(`unexpected input or EOF (expected ";")`)
            }
            ⋮
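the scanning helpers (skipSpace, word, is) aren't shown above; their definitions are my guess at what the author intended, but the idea is that each one is a few lines, and `word` has to check a boundary so "var" doesn't match the prefix of "variable":

```go
package main

import (
	"fmt"
	"unicode"
)

type Pos = int

// skipSpace advances i past any whitespace.
func skipSpace(s string, i Pos) Pos {
	for i < len(s) && unicode.IsSpace(rune(s[i])) {
		i++
	}
	return i
}

// is reports whether the literal lit appears at position i.
func is(s string, i Pos, lit string) bool {
	return i+len(lit) <= len(s) && s[i:i+len(lit)] == lit
}

// word is like is, but also requires that the match not be followed
// by an identifier character, so the "var" keyword doesn't match
// the start of an identifier like "variable".
func word(s string, i Pos, w string) bool {
	if !is(s, i, w) {
		return false
	}
	j := i + len(w)
	if j >= len(s) {
		return true
	}
	c := rune(s[j])
	return c != '_' && !unicode.IsLetter(c) && !unicode.IsDigit(c)
}

func main() {
	fmt.Println(word("var x", 0, "var"))    // true
	fmt.Println(word("variable", 0, "var")) // false
	fmt.Println(is("x = 1", 2, "="))        // true
}
```

the whole "lexer" collapses into these three functions, threaded through the parser by hand.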


Possible, sure, but without a combinator framework you have to do a ton of the rote work yourself.
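for comparison, here's roughly what that rote position-threading looks like once you fold it into a tiny combinator layer (names like Lit/Seq/Opt are hypothetical, just a sketch of the idea):

```go
package main

import (
	"fmt"
	"strings"
)

// A Parser consumes input starting at i and returns the new
// position, or ok=false if the input doesn't match.
type Parser func(s string, i int) (next int, ok bool)

// Lit matches a literal string.
func Lit(lit string) Parser {
	return func(s string, i int) (int, bool) {
		if strings.HasPrefix(s[i:], lit) {
			return i + len(lit), true
		}
		return i, false
	}
}

// Seq runs parsers one after another, failing if any fails.
func Seq(ps ...Parser) Parser {
	return func(s string, i int) (int, bool) {
		for _, p := range ps {
			var ok bool
			if i, ok = p(s, i); !ok {
				return i, false
			}
		}
		return i, true
	}
}

// Opt makes a parser optional: it never fails, it just
// declines to advance when the inner parser doesn't match.
func Opt(p Parser) Parser {
	return func(s string, i int) (int, bool) {
		if j, ok := p(s, i); ok {
			return j, true
		}
		return i, true
	}
}

func main() {
	// the shape of `var x = 1;` (whitespace handling elided for brevity)
	decl := Seq(Lit("var "), Lit("x"), Opt(Seq(Lit(" = "), Lit("1"))), Lit(";"))
	_, ok := decl("var x = 1;", 0)
	fmt.Println(ok) // true
}
```

the hand-rolled version above and this one do the same work; the combinators just hide the `if ..., err = ...; err != nil { return }` boilerplate inside Seq.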



