class Stacklang::Parser

Defined in:

stacklang/parser.cr

Constant Summary

NEWLINE = "\n"
OPERATORS_PRIORITIES = [
  {Arity::Binary, Associativity::Left, ["."]},
  {Arity::Binary, Associativity::Left, ["["]},
  {Arity::Unary, nil, ["!", "~", "&", "*", "-"]},
  {Arity::Binary, Associativity::Left, ["/", "*"]},
  {Arity::Binary, Associativity::Left, ["+", "-"]},
  {Arity::Binary, Associativity::Left, [">>", "<<"]},
  {Arity::Binary, Associativity::Left, ["&"]},
  {Arity::Binary, Associativity::Left, ["|", "^"]},
  {Arity::Binary, Associativity::Left, ["==", "!="]},
  {Arity::Binary, Associativity::Left, ["<", ">", "<=", ">="]},
  {Arity::Binary, Associativity::Left, ["&&"]},
  {Arity::Binary, Associativity::Left, ["||"]},
  {Arity::Binary, Associativity::Right, ["=", "+=", "-="]},
]

List of operator groups, ordered by decreasing precedence. Mostly follows the Crystal language.

Constructors

Class Method Summary

Instance Method Summary

Constructor Detail

def self.new(io : IO, filename : Nil | String = nil, fancy = true) #


Class Method Detail

def self.open(path) #

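A hypothetical usage sketch of the class method above. It assumes `#unit` parses a whole compilation unit (the method name is taken from the instance method list below, its exact behavior is not documented here), and the `.sl` file path is invented for illustration:

```crystal
# Hypothetical usage sketch: open a source file and parse it into an AST.
# Assumes `#unit` parses a whole compilation unit; "example.sl" is made up.
require "stacklang/parser"

parser = Stacklang::Parser.open "example.sl"
ast = parser.unit
```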

Instance Method Detail

def consume!(expected) : Token #

Consume a token and raise if it is not equal to expected. Return the token.


def consume?(token) #

Return true and consume the current token if it is equal to the token parameter; return false otherwise.


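A hedged sketch of how `#consume?` and `#consume!` typically combine in a recursive-descent helper. The helper, the grammar rule, and the `AST::Identifier` element type are invented for illustration; only `consume?`, `consume!`, and `identifier` come from this class:

```crystal
# Hypothetical helper: parse a parenthesized, comma-separated identifier list.
# `consume!` enforces required punctuation; `consume?` tests optional tokens.
def identifier_list
  consume! "("
  names = [] of AST::Identifier # element type assumed for illustration
  unless consume? ")"           # allow an empty list: "()"
    loop do
      names << identifier
      break unless consume? "," # another element follows only after a comma
    end
    consume! ")"
  end
  names
end
```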
def current #

Return the current token.


def eof? #

True if End Of File is reached.


def expression #

def expression_lexer : Array(AST::Expression | NamedTuple(kind: Arity, value: Tokenizer::Token)) #

Greedily consume all tokens of a single expression. Stop when a ')', '}', ',' or ']' is reached. Leaf expressions are parsed as AST nodes, but operators are kept as-is, in the order they appear. Return an array of all the components of the expression in their original order, mixing AST nodes and tuples of the form {kind: Arity, value: Token} for operators.


def expression_parser(chain : Array(AST::Expression | NamedTuple(kind: Arity, value: Tokenizer::Token))) : AST::Expression #

Take a chain of Expression AST nodes and unprocessed operators (such as the output of #expression_lexer) and resolve operator precedence and associativity to produce a single Expression AST node.


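The two-phase approach described above (flatten the expression into a chain, then resolve precedence) can be sketched in isolation. This is not the library's implementation: a minimal stand-alone illustration over integers and operator strings, with a reduced table in the same decreasing-precedence order as OPERATORS_PRIORITIES, handling binary left-associative operators only:

```crystal
# Minimal stand-alone sketch of precedence resolution over a flat chain.
# Operands are Int32, operators are Strings; the table is a reduced,
# binary-and-left-associative-only version of OPERATORS_PRIORITIES.
PRIORITIES = [["*", "/"], ["+", "-"]]

def apply(op : String, a : Int32, b : Int32) : Int32
  case op
  when "*" then a * b
  when "/" then a // b
  when "+" then a + b
  else          a - b
  end
end

def resolve(chain : Array(Int32 | String)) : Int32
  # Reduce one precedence group at a time, tightest-binding first.
  PRIORITIES.each do |group|
    reduced = [] of Int32 | String
    i = 0
    while i < chain.size
      item = chain[i]
      if item.is_a?(String) && group.includes?(item)
        # Fold this operator with its neighbors; scanning left to right
        # makes the reduction left-associative.
        left = reduced.pop.as(Int32)
        right = chain[i + 1].as(Int32)
        reduced << apply(item, left, right)
        i += 2
      else
        reduced << item
        i += 1
      end
    end
    chain = reduced
  end
  chain[0].as(Int32)
end

# "1 + 2 * 3" groups as 1 + (2 * 3) because "*" sits in an earlier group.
puts resolve([1, "+", 2, "*", 3] of Int32 | String) # => 7
```

The real method builds an AST::Expression instead of computing a value, but the grouping decision it has to make is the same.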
def function #

Parse and return a function.


def global #

def identifier(token = nil) #

Parse and return an identifier. If a token is given as a parameter, use it instead of consuming one.


def literal(token = nil) #

def next_token!(expected) : Token #

Get the next token. Raise if there is none. The expected parameter is used to document the raised error.


def number(token = nil) #

def requirement #

def statement #

def structure #

def syntax_error(expected, context = nil) : Exception #

Raise an error stating that the current token is unexpected. The expected parameter is used to document the error.


def type_constraint(context_token : Token, colon = true, explicit = false) #

The context token is used to document syntax errors when a type AST node is generated implicitly.


def type_name(token = nil) #

Parse and return a type name. If a token is given as a parameter, use it instead of consuming one.


def type_name?(token = nil) #

Try to parse and return a type name. If a token is given as a parameter, use it instead of consuming one. On failure, return nil and do not alter the parser state.


def unit #
