class PgSqlLexer::Lexer

Overview

A lexer is a tool that analyzes a stream of text and produces a list of the tokens it identifies in that stream. This simple class takes a string of SQL that has already been accepted as syntactically correct by the Postgres database, for example SQL taken from a log file (my use case).

To create an instance of this class, pass it a string containing the SQL to parse into tokens. You can optionally control the sets of words that the lexer recognises as keywords. Refer to the Keyword class; in short, there are two sets of keywords, reserved and non-reserved. By default the lexer uses only the reserved set, but you can change this via arguments to the constructor, as shown in the sketch below.
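For example, this minimal sketch (assuming the shard is required as pg_sql_lexer) enables the non-reserved keyword set in addition to the default reserved set:

require "pg_sql_lexer"

# Recognise non-reserved keywords as well as the default reserved set.
lexer = PgSqlLexer::Lexer.new(
  "SELECT name FROM users",
  use_reserved_keywords: true,
  use_non_reserved_keywords: true
)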

As part of instantiation the string is parsed, and if no errors are encountered the tokens property will contain all tokens found during this process.
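A minimal sketch of reading the tokens property after construction (assuming tokens behaves as an enumerable collection; the exact structure of each token is not shown here):

lexer = PgSqlLexer::Lexer.new("SELECT 1 + 2")
lexer.tokens.each do |token|
  puts token # prints each token's string representation
end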

The parsing rules have been mostly derived from the Postgres Docs.

Here is an example of this class being used:

raw_sql = File.read("queries.sql") # for example, slurp the SQL from a file
minified = PgSqlLexer::Formatter.new(PgSqlLexer::Lexer.new(raw_sql).tokens).format_minified

Defined in:

pg_sql_lexer/lexer.cr

Constructors

Constructor Detail

def self.new(buffer : String, use_reserved_keywords : Bool = true, use_non_reserved_keywords : Bool = false)
