Performance comparison

Oct 29, 2011 at 2:01 PM

I received an SQL SELECT statement with a large WHERE clause containing 688 predicates from a client (production code).

Parsing it with Irony takes 55 ms; with an ANTLR-generated lexer/parser (C#), 40 ms; and with the parser inside SQL Server 2008 R2, 17 ms.

Is there any way to tune Irony's performance?


Oct 29, 2011 at 5:09 PM

55 ms seems high. First of all, make sure the environment/Irony is warmed up when you run the test - are you sure parser construction time is not included? Create the parser object once, then run the parse 100 times to measure performance. Also make sure you don't introduce any extra costs like AST construction or custom code/terminals that do something unreasonable. If possible, send me the file directly and I can play with it and see where the bottleneck is - that would actually help a lot; we might discover some loophole that leaks performance.
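The measurement approach above can be sketched roughly as follows. This is a hedged example, not code from the thread: `MyGrammar` is a placeholder for your own Irony grammar class, and the input file name is made up; the point is that parser construction and JIT warm-up are excluded from the timed loop.

```csharp
using System;
using System.Diagnostics;
using Irony.Parsing;

class ParseBenchmark
{
    static void Main()
    {
        // Construction (grammar analysis, parser tables) is paid once, outside the timer.
        var parser = new Parser(new MyGrammar());   // MyGrammar: your grammar class
        string sql = System.IO.File.ReadAllText("big-where-clause.sql"); // placeholder file

        parser.Parse(sql);                          // warm-up run: JIT + internal caches

        const int runs = 100;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < runs; i++)
            parser.Parse(sql);
        sw.Stop();

        Console.WriteLine("avg parse time: {0:F2} ms",
            sw.Elapsed.TotalMilliseconds / runs);
    }
}
```

Averaging over 100 iterations smooths out GC pauses and OS scheduling noise that can easily dominate a single sub-100 ms measurement.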

I had planned to start looking into this perf tuning; other folks have already reported that GOLD/ANTLR run faster in some cases - see the other recent discussion with Alexander Mutel. It is mainly the scanner - most of the time is spent there (in Irony and in any other parser framework).

I'm thinking about adding a special facility for fast scanning of simplified versions of terminals, which would fall back to full-scan mode when it hits some exotic terminal arrangement.
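A minimal sketch of that two-tier idea: a fast path that dispatches on the current character and handles the common, "simplified" terminals inline, falling back to a general matching loop for anything it doesn't recognize. All names here (`FastScanner`, `FullScan`, the token types) are illustrative assumptions, not Irony's actual API.

```csharp
using System;

class Token
{
    public string Type;
    public string Text;
    public Token(string type, string text) { Type = type; Text = text; }
}

class FastScanner
{
    // Fast path: classify by the first character and consume a simple run.
    public Token Scan(string input, ref int pos)
    {
        char c = input[pos];
        if (char.IsDigit(c))
            return ScanWhile(input, ref pos, char.IsDigit, "number");
        if (char.IsLetter(c) || c == '_')
            return ScanWhile(input, ref pos,
                ch => char.IsLetterOrDigit(ch) || ch == '_', "identifier");
        // Exotic terminal arrangement: hand off to the full scanner.
        return FullScan(input, ref pos);
    }

    static Token ScanWhile(string s, ref int pos, Func<char, bool> pred, string type)
    {
        int start = pos;
        while (pos < s.Length && pred(s[pos])) pos++;
        return new Token(type, s.Substring(start, pos - start));
    }

    static Token FullScan(string s, ref int pos)
    {
        // Placeholder for the slow, general try-every-terminal loop.
        return new Token("other", s[pos++].ToString());
    }
}
```

The win comes from the common case (identifiers, numbers, whitespace) never entering the general terminal-matching loop; only rare inputs pay the full-scan cost.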

Dec 12, 2011 at 3:57 PM

Using the Performance Wizard in Visual Studio 2010, I can see that the bottleneck lies in CoreParser.ReadInput(), which spends most of its time calling FetchToken().

Dec 12, 2011 at 4:01 PM

That's no surprise: FetchToken calls the Scanner to produce a token, and it is well known that parsers in general spend most of their time in the scanner. By the way, I just upgraded the scanner; it should run much faster. The code is on the Source Code page (in the repo), not in the Downloads zip yet. Try this new version and let me know the result.



Dec 12, 2011 at 6:09 PM

The new version performs about the same for my test case. However, the bottleneck has moved to Scanner.MatchTerminals().