WhosUsingPyparsing

=Contents:=


 * Coconut - Simple, elegant, Pythonic functional programming
 * Undebt - Massive, automated code refactoring
 * Docket Alarm - Legal Research "Connector" Language
 * Booleano - Boolean Condition Interpreter
 * Reverse Snowflake Joins
 * Graphite Real-time Graphing CLI
 * Freeode Project - SIML ODE language
 * svg2pycairo - Read SVG files and display using Cairo
 * asDox - Actionscript class extractor
 * svg2imagemap - SVG -> HTML image map converter
 * Quameon
 * Pybtex - BibTeX parser
 * Tunnelhack - text adventure
 * madlib - fiction generator web service
 * poetrygen - poetry generator
 * PyMLNs - Markov Logic Networks
 * dsniff - network monitor
 * Bauble - biodiversity database
 * Firebird PowerTool
 * Numbler - spreadsheet web service
 * Zhpy Chinese->English Preprocessor
 * Robot Instruction Assembler
 * ZMAS agent language
 * pymon Network Service Monitor
 * Simulink MDL parser
 * Cubulus OLAP - MDX parser
 * Django database schema migration utility
 * Python script trace visualizer
 * BLAST data file parser
 * pytnef
 * pylda
 * PNPD/Nova
 * SFE
 * Jppy
 * grailmud
 * ARFF2DB.py Python Cookbook Recipe
 * Java Thread Dump Parser
 * SQL-style LDAP query utility
 * bitfield
 * clipartbrowser
 * WURFL - Wireless Universal Resource File
 * CTL4J - Component Template Library for Java
 * XAQL - Xapian Query Language Parser
 * TAPPY - Tidal Analysis Package in PYthon
 * matplotlib
 * ldaptor
 * pydot
 * twill
 * BIND named.conf parser
 * Boa Constructor code conversion/upgrade utility
 * Process Stalker
 * MathDOM - MathML parser
 * Distribulator - distributed admin command parser
 * Spyse - Python Multi-Agent System Environment
 * FileSystemStorage SQL parser
 * Verilog parser

=Coconut - Simple, elegant, Pythonic functional programming=
[|Coconut] is a functional programming language that compiles to Python. Coconut enhances the repertoire of Python programmers to include the tools of modern functional programming, in such a way that those tools are easy to use and immensely powerful; that is, Coconut does to functional programming what Python did to imperative programming.

Coconut uses pyparsing as the heart of its compiler, parsing and transforming Coconut source code into compiled version-independent Python entirely through the power of pyparsing. [|Coconut with Evan Hubinger (podcastinit.com)]

=Undebt - Massive, automated code refactoring=
[|Undebt] is a tool built by Yelp.com for fast, straightforward, reliable code refactoring. [|Yelp successfully used Undebt to refactor their 3 million line code base].

Undebt patterns, which define what to replace when refactoring code, are powered by pure pyparsing; Undebt's entire library of patterns is built from pyparsing objects.

[|Tech Debt and Refactoring at Yelp! with Andrew Mason (podcastinit.com)]

=Docket Alarm - Legal Research "Connector" Language=
[|Docket Alarm] uses pyparsing to support complex legal queries. While searching the law, lawyers regularly use a relatively advanced and unique search syntax, which is similar to Boolean searching but includes a few additions, such as stemming (using the ! mark) and proximity searching (using the w/N operator). For example, the following query will find all documents that have the words "employment" or "employee" within ten words of any word that starts with "discrim":

code
(employment or employee) w/10 discrim!
code

Full documentation for this query language is [|outlined in this article]. Currently, the library is not publicly released, but please contact [admin at docketalarm dot com] for more info if interested.
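The connector syntax described above can be approximated in a few lines of pyparsing. This is a hypothetical sketch (the real Docket Alarm parser is not public): w/N is treated as an infix operator, and a trailing ! marks a stemmed term.

```python
from pyparsing import (CaselessKeyword, Combine, Optional, Regex, Word,
                       alphas, infixNotation, opAssoc)

# Hypothetical sketch of a legal-query grammar -- not Docket Alarm's code.
AND = CaselessKeyword("and")
OR = CaselessKeyword("or")
WITHIN = Regex(r"w/\d+")                      # proximity operator, e.g. w/10
term = Combine(Word(alphas) + Optional("!"))  # "discrim!" = stemmed term

query = infixNotation(term, [
    (WITHIN, 2, opAssoc.LEFT),   # proximity binds tightest
    (AND, 2, opAssoc.LEFT),
    (OR, 2, opAssoc.LEFT),
])

parsed = query.parseString("(employment or employee) w/10 discrim!")
print(parsed.asList())
```

infixNotation handles the parenthesized grouping automatically, so the example query from the text parses into a nested structure ready for evaluation.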

=Booleano - A Generic Library for Interpreting Boolean Conditions=
Gustavo Narea wrote [|Booleano] to support the inclusion of boolean expression processing as a customizable library function. Gustavo gives various examples on his website, including multi-lingual examples. Don't write another search query parser - use Booleano!
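To give a flavor of what a condition interpreter involves (this sketch is mine, not Booleano's actual API), pyparsing's infixNotation can turn a boolean condition into a nested structure in a few lines:

```python
from pyparsing import CaselessKeyword, Word, alphanums, infixNotation, opAssoc

# Hypothetical sketch of a boolean-condition parser -- not Booleano's API.
AND, OR, NOT = map(CaselessKeyword, "and or not".split())
operand = Word(alphanums + "_.")   # bare identifiers, possibly dotted

condition = infixNotation(operand, [
    (NOT, 1, opAssoc.RIGHT),   # unary negation binds tightest
    (AND, 2, opAssoc.LEFT),
    (OR,  2, opAssoc.LEFT),
])

result = condition.parseString("user.is_admin or not account_locked")
print(result.asList())
```

The nested list that comes back can then be walked to evaluate the condition against whatever variable bindings the application supplies.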

=Reverse Snowflake Joins - SQL statement grapher=
Alexandru Toth wrote [|this utility] to graph the structure of SQL statements, making it easier to identify linkages and errors in complex SQL statements. Alexandru's utility parses this SQL statement:

code
SELECT film.film_id AS FID, film.title AS title, film.description AS description,
       category.name AS category, film.rental_rate AS price, film.length AS length,
       film.rating AS rating,
       GROUP_CONCAT(CONCAT(actor.first_name, _utf8' ', actor.last_name) SEPARATOR ', ') AS actors
FROM category
LEFT JOIN film_category ON category.category_id = film_category.category_id
LEFT JOIN film ON film_category.film_id = film.film_id
JOIN film_actor ON film.film_id = film_actor.film_id
JOIN actor ON film_actor.actor_id = actor.actor_id
GROUP BY film.film_id;
code

and generates this DOT code:

code
graph {
  node [shape=record, fontsize=12];
  graph [splines=true];
  rankdir=LR;

  CATEGORY [style=filled, fillcolor=white, label="CATEGORY | (CATEGORY) | category_id|name"];
  FILM_CATEGORY [style=filled, fillcolor=white, label="FILM_CATEGORY | (FILM_CATEGORY) | category_id| film_id"];
  FILM_ACTOR [style=filled, fillcolor=white, label="FILM_ACTOR | (FILM_ACTOR) | actor_id| film_id"];
  ACTOR [style=filled, fillcolor=white, label="ACTOR | (ACTOR) | actor_id|first_name|last_name"];
  FILM [style=filled, fillcolor=white, label="FILM | (FILM) | description| film_id|GROUP BY film_id|length|rating|rental_rate|title"];

  ACTOR:actor_id -- FILM_ACTOR:actor_id [color=black arrowtail="none" arrowhead="none"];
  CATEGORY:category_id -- FILM_CATEGORY:category_id [color=orange arrowtail="dot" arrowhead="none"];
  FILM:film_id -- FILM_CATEGORY:film_id [color=orange arrowtail="none" arrowhead="dot"];
  FILM:film_id -- FILM_ACTOR:film_id [color=black arrowtail="none" arrowhead="none"];
}
code

which in turn creates this graph (using Pydot):

=Graphite - Enterprise Real-time Graphing Engine Command Line Interface=
Chris Davis wrote [|Graphite] for Orbitz to display real-time graphics for monitoring operational parameters. Graphite includes a [|command line interface] written with pyparsing, to parse and execute data sourcing and graphing commands (see [|screenshot]).

=Freeode project - SIML language for solving ODE's=
Eike Welk has developed the SIML language as part of the [|Freeode] project for solving differential equations. The SIML language is parsed using pyparsing to (among other things) convert this differential equation modeling code:

code
func dynamic:
    mu = mu_max * S/(S+Ks); #growth speed (of biomass)
    $X = mu*X;             #change of biomass concentration
    $S = -1/Yxs*mu*X;      #change of sugar concentration
end
code

to this Python code:

code
#do computations
v_mu = self.p_mu_max*v_S/(v_S + self.p_Ks)
v_X_dt = v_mu*v_X
v_S_dt = -1.0/self.p_Yxs*v_mu*v_X
code

The generated code is then integrated with a larger simulation environment using SciPy and NumPy.

=svg2pycairo - Read SVG files and display using Cairo=
Donn Ingle has written a [|short utility] to read the M, L, C, and Z commands of an SVG file and then display/render the SVG graphics using [|Cairo]. Donn waxes poetic on the virtues of pyparsing...

=svg2imagemap - Create image maps from SVG files=
David Lynch [|describes in his blog] a little utility script for auto-generating HTML image maps from SVG files. His utility uses pyparsing to minimally extract SVG commands, and create the corresponding HTML image map definition. His website includes a (semi-)automatically generated [|image map] of this world map (from Wikipedia):

=asDox - Actionscript class extractor=
[|asDox] is an Actionscript3 parser that extracts AS class definitions, and generates a document structure that can be used for code generation, documentation, or further class operations.

=Quantum Monte Carlo in Python=
[|quameon] includes a parser for differential equation expressions. More info at http://www.markdewing.com/code_gen/lightning-sympy-code-gen.pdf.

=pybtex - a BibTeX parser=
[|pybtex] is a BibTeX parser, a replacement for bibtex, written by ero-sennin. pybtex parses bibliography entries like:

code
@BOOK{strunk-and-white,
    author = "Strunk, Jr., William and E. B. White",
    title = "The Elements of Style",
    publisher = "Macmillan",
    edition = "Third",
    year = 1979
}
code

and generates LaTeX, text, or HTML bibliography files.
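A minimal from-scratch sketch (not pybtex's actual grammar, and ignoring brace-delimited values and nested braces) shows how little pyparsing it takes to crack an entry like the one above:

```python
from pyparsing import (Group, QuotedString, Suppress, Word,
                       alphanums, alphas, delimitedList)

# Hypothetical mini-grammar for one BibTeX entry: @TYPE{key, field = value, ...}
LBRACE, RBRACE, AT, EQ, COMMA = map(Suppress, "{}@=,")
entry_type = Word(alphas)
cite_key = Word(alphanums + "-_:.")
value = QuotedString('"') | Word(alphanums)      # quoted string, or bare number
field = Group(Word(alphas + "_") + EQ + value)
entry = (AT + entry_type("type") + LBRACE + cite_key("key") + COMMA
         + delimitedList(field)("fields") + RBRACE)

rec = entry.parseString('@BOOK{strunk-and-white, title = "The Elements of Style", year = 1979}')
print(rec.type, rec.key, rec.fields.asList())
```

The results names ("type", "key", "fields") let follow-on code pull out the pieces by name rather than by position.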

=Tunnelhack=
[|Tunnelhack] is a text adventure game similar to Nethack. Tunnelhack uses pyparsing to parse its game config file.

=madlib=
[|madlib] is a madlib processor web service. Enter a madlib markup string, and click. For example, this input:

code
'Twas, and the Did and in the : All were the , And the.
code

generates:

code
'Twas projittableptorporal, and the bandlant keddah Did fum and penine in the mel: All pleuropodiforcinguidae were the antasco, And the proophanimpletaman fascent unmaringcratoria.
code

madlib uses pyparsing to extract the markup codes in the madlib input string.

=poetrygen=
[|poetrygen] is a poetry generator that uses pyparsing to parse meter format strings:

code
A heroic couplet: "(ox)5A :1A"
A whole haiku: ".5A .7B .5C"
A limerick: "(oxo)3A ;1 (oxo)2B ;3 ;1"
code

=PyMLNs - Python Markov Logic Networks=
Markov logic networks (MLNs) combine first-order logic with the probabilistic semantics of graphical models. [|PyMLNs] offers inference and learning tools for MLNs. PyMLNs uses a pyparsing parser to process MLN formulas. Here is a screenshot of the PyMLNs query tool:

=dsniff=
[|dsniff] is a simple Python application framework for network monitoring. dsniff uses pyparsing to process filter commands:

code
'tcp':{ 'p':[6] },
'tcp or udp':{ 'p':[6,17] },
'tcp and dst port 80':{ 'p':[6], 'dport':[80] },
'tcp and dst port 22 or 80':{ 'p':[6], 'dport':[22,80] },
'dst 1.2.3.4 and tcp and dst port 22':
    { 'p':[6], 'dst':[dnet.addr('1.2.3.4')], 'dport':[22] },
'dst net 5.6.7.0/24 or 1.2.3.0/24 and tcp and src port 80 or 81':
    { 'p':[6], 'sport':[80,81], 'dst':[dnet.addr('5.6.7.0/24'), dnet.addr('1.2.3.0/24')] },
code

=Bauble=
[|Bauble] is a biodiversity collection manager. It is intended to be used by botanic gardens, herbaria, arboreta, etc. to manage their collection information. It is an open, free, cross-platform alternative to BG-Base and similar software. It is written in Python using the Python GTK toolkit.

Bauble uses pyparsing to parse application-specific search strings for locating botanic samples:

code
plant where accession.species.genus.genus=Ptychosperma and location.site="Block 14"
code

would return all of the plants whose genus is Ptychosperma and are located in "Block 14".
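A speculative sketch of such a domain-query grammar (Bauble's actual parser lives in its own source tree and may differ) might look like:

```python
from pyparsing import (CaselessKeyword, Group, QuotedString, Suppress,
                       Word, alphanums, delimitedList)

# Hypothetical sketch of a Bauble-style search grammar: a domain name,
# "where", then dotted-path = value clauses joined by "and".
WHERE = CaselessKeyword("where")
AND = CaselessKeyword("and")
domain = Word(alphanums)
path = delimitedList(Word(alphanums + "_"), delim=".", combine=True)
value = QuotedString('"') | Word(alphanums)
clause = Group(path + Suppress("=") + value)
search = domain("domain") + Suppress(WHERE) + delimitedList(clause, delim=AND)("criteria")

res = search.parseString(
    'plant where accession.species.genus.genus=Ptychosperma and location.site="Block 14"')
print(res.domain, res.criteria.asList())
```

Each clause comes back as a (dotted path, value) pair, which maps naturally onto an ORM attribute lookup.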

=Firebird PowerTool=
Pavel Císar used pyparsing to develop a command language for developing Firebird plug-ins. He presented his work at the [|Firebird Conference-2006].

=Numbler spreadsheet web server=
[|Numbler] is the open source engine that powers numbler.com. Basic features include:
 * real-time spreadsheet collaboration
 * web services integration with amazon, ebay, and others
 * pluggable web service extension support
 * support for hundreds of languages via International Components for Unicode

Numbler uses pyparsing to parse date strings of varying formats.

=Zhpy - Chinese Python -> English Python preprocessor=
[|Chinese project page] [|English project page]

Zhpy allows Chinese developers using Python to write code using traditional Chinese. Zhpy provides a front-end to Python that scans through the Chinese source and exchanges Chinese keywords for English Python keywords, which are understandable to the Python compiler. The project wiki shows a "Hello, World!" example, along with the same program in English (for those of you who can't read Chinese).

=Robot-centric instruction set assembler=
Susan Gordon and Dr. James Wolfer of Indiana University South Bend have published a [|paper] at the International Conference on Engineering and Computer Education in São Paulo, Brazil in March, 2007, describing the robotic instruction assembler they use in the upper-level course C335, "Computer Structures". Using pyparsing, they created an assembler to convert assembly code such as this:

code
ld R4,$300   # wall ahead
ld R5,$60    # clear ahead
ld R6,$300   # side too close
ld R7,$50    # side too far
rinit
forward:
rspeed $1,$1            # forward at speed 1,1
rsens                   # sensor values to R8-R15
jlt   Rb,R4,lookleft    # if sensor3 < wall ahead value - check left
rspeed $1,$-1           # else, spin right to turn at wall
chclear:
rsens                   # sensor values to check spin
jgt   R9,R5,chclear     # if sensor1 > clear ahead value - keep spinning
rspeed $1,$1            # else, reset speed forward
lookleft:
rsens                   # sensor values to check for left wall
jlt   R8,R6,chtoofar    # if sensor0 < side too close - check too far
rspeed $2,$1            # else, move away from wall slightly
jmp   forward           # go back to check forward again
chtoofar:
jgt   R8,R7,forward     # if sensor0 > side too far - go forward
rspeed $1,$2            # else, move closer to wall slightly
jmp   forward           # go back to check forward again
code

to byte-level instructions that can be run using the Khepera robots used as part of the class.

=ZMAS agent language (Zope Multi Agent Systems)=
Eduardo Bastos has written [|ZMAS], a parser for the [|FIPA Agent Communication Language], running within a Zope agent framework. Here is an extract from a screenshot of the configuration screen for an agent, showing the definition of an agent message.

=pymon=
[|pymon] is an open source network and process monitoring solution implemented in Python. The interface and configuration are designed to be easily and rapidly deployed, saving on time and overhead often associated with other monitoring solutions.

pymon uses pyparsing to implement an interactive command shell. Here are some sample commands in pymon:

code
help
show nodes
show services
show lists
node add www1_anon_user www.adytum.us
node add www1_auth_user pymonuser:asecret@www.adytum.us
node show www1_auth_user
service http-status add www1_anon_user path /test/index.html
service ping add www1_auth_user enabled True org Adytum interval 20 binary /here/ping count 10 ok-threshold dummyokthresh warn-threshold dummywarnthresh error-threshold dummyerrorthresh failed-threshold dummyfailedthresh scheduled-downtime 2005.12.01 03:00:00 - 2005.12.01 04:00:00
code

=Simulink MDL parser=
Kjell Magne Fauske has developed a [|parser for Simulink MDL files]. These files are remarkably similar in form to JSON, so he was able to leverage the JSON parser example to parse these files. My favorite line from his blog: "Writing a full-fledged parser with Pyparsing is so easy that it almost feels like cheating." :)

=Cubulus OLAP - MDX parser=
Alexandru Toth has written the [|Cubulus parser] for OLAP [|Multidimensional Expressions (MDX)]. Here is a sample MDX query:

code
Select {[time].[all time].children} on rows,
       {[region].[all region].children} on columns
from alxtoth
code

=Django database schema migration utility=
Alexander Koshelev has developed the [|Django Schema Evolution] utility. He is using pyparsing to extract table and index definitions from modified SQL schema definition language, for comparison against existing database schemas, in order to auto-generate SQL statements to perform the necessary table and index modifications.

=Python script trace visualizer=
Andrew Sutherland writes in his [|blog] that he has put together a prototype Python script trace visualizer. The utility uses pyparsing to extract data from gdb's mi2 interface, and then generates a trace file which is used to create an HTML file with highlighted syntax, plus a timeline-style trace to the left of the source.

Here is a screen image from his blog:

=BLAST data file parser=
C. Titus Brown (of twill fame) cobbled together this [|BLAST data file parser] over the weekend. He picked pyparsing for this task "because the BLAST format changes frequently, and I need to be able to maintain the parser. So readability of the parser code is VERY important."

=pytnef=
Petri Savolainen has developed [|pytnef] - the tnef module provides high-level access to TNEF decoding; namely, listing contents of TNEF attachments and extracting and retrieving TNEF body/bodies and embedded files. Petri used pyparsing to create a special-purpose RTF parser specifically for extracting HTML from RTF contained within a TNEF attachment.

=pylda=
Romain Chantereau's Python Logical Data Access (pylda) project permits access to data via logical requests. (Here is the [|submission of pylda] to the Gna! project.)

The major impact of logical requests is that access to the information is non-hierarchical.

pylda is a translator between logical requests and a storage container (fs, db, p2p, etc.). The logical requests consist of a combination of keyword and metadata elements (attributes). This combination follows typical boolean algebra. As a trivial example, one could request the picture taken on 11/07/2007.

=PNPD/Nova=
Tim Blechmann's [|PNPD project] is a dataflow audio programming language that is based on the syntax of Pure Data, with aspects of Max/MSP and jMax. PNPD uses pyparsing to parse its audio patching language:

code
{
    cerr<'foo>(loadbang | loadbang, init)

    # define base canvas
    foo = foo<1>
    bar = bar(foo[0], baz<1>[1])
    bar[1] -> foo[2]
    baz -> baz
    canvas = {
        foo = foo<1>
        bar = bar(foo[0], baz<1>[1])
    }
}
code

=SFE=
[|SFE], by Robert Cimrman, is a finite element solver used for simulations in (bio)mechanics and shape optimization of closed channels w.r.t. fluid flow. SFE uses pyparsing to process region definitions to define model constraints and initial conditions:

code
nodes of surface -n region_1
nodes in (y <= 0.00001) & (x < 0.11)
nodes in ((y <= 0.00001) & (x < 0.11))
nodes in (((y <= 0.00001) & (x < 0.11)))
all -n nodes in (y == 0.00001)
all -n nodes of surface
all -e region_100
region_1 -n nodes of surface *e region_8 *n nodes in (y > 0)
nodes of surface +n nodes by pokus( x, y, z )
elements of group 6 +e nodes by fn2_3c( x )
region_1 *n (region_2 +e (nodes in (y > 0) *n region_32)) -n nodes of surface -e region_5
code

=Jppy=
The JPPY project is a JPilot-Python API to access databases on the Palm Pilot. See [|these screenshots] to see the query/filter language that has been incorporated into jppy, using pyparsing.

=grailmud=
Sam Pointon has developed [|grailmud], a MUD development and runtime engine, using pyparsing to perform user command processing.

=ARFF2DB.py Python Cookbook Recipe=
Pradeep Kishore Gowda has submitted a [|recipe] to the Python Cookbook for an ARFF (attribute-relation file format) parser combined with SQLAlchemy to store arbitrary objects into any database. See http://www.cs.waikato.ac.nz/~ml/weka/arff.html for more on ARFF.

=Java Thread Dump Parser=
Camel Richard (aka avidfan on comp.lang.python) describes in his [|blog] a parser for processing Java thread dumps, that reads this:

code
"FILE Message Writer" daemon prio=5 tid=0x0093d7c0 nid=0xf in Object.wait [a4e81000..a4e819c0]
    at java.lang.Object.wait(Native Method)
    - waiting on (a java.util.LinkedList)
    at java.lang.Object.wait(Object.java:429)
    at com.sitraka.pas.common.util.queue.ListQueue.dequeue(ListQueue.java:137)
    - locked (a java.util.LinkedList)
    at com.sitraka.pas.common.log.FileLogTarget$MessageWriter.run(FileLogTarget.java:359)
    at java.lang.Thread.run(Thread.java:534)
code

and extracts this:

code
THREADNAME =    "FILE Message Writer"
PRIORITY =      5
TID =           0x0093d7c0
NID =           0xf
RUNSTATE =      in Object.wait
MEMORY ADDRESS = [a4e81000..a4e819c0]

CONDITIONS:
- waiting on (a java.util.LinkedList)
- locked (a java.util.LinkedList)
code
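The header line alone can be cracked with a short grammar. This is a hedged reconstruction (the blog's actual parser may well differ in its details):

```python
from pyparsing import (Literal, Optional, QuotedString, Regex, SkipTo,
                       Suppress, Word, nums)

# Hypothetical sketch of a thread-dump header grammar: thread name,
# daemon flag, priority, thread/native ids, run state, address range.
hexnum = Regex(r"0x[0-9a-fA-F]+")
header = (QuotedString('"')("name")
          + Optional(Literal("daemon"))
          + Suppress("prio=") + Word(nums)("prio")
          + Suppress("tid=") + hexnum("tid")
          + Suppress("nid=") + hexnum("nid")
          + SkipTo("[")("state")
          + Suppress("[") + Regex(r"[0-9a-f]+\.\.[0-9a-f]+")("range") + Suppress("]"))

line = '"FILE Message Writer" daemon prio=5 tid=0x0093d7c0 nid=0xf in Object.wait [a4e81000..a4e819c0]'
info = header.parseString(line)
print(info.name, info.prio, info.tid, info.state.strip())
```

The stack frames and `- waiting on` / `- locked` condition lines would be handled by further per-line expressions in the same style.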

=SQL-style LDAP query utility=
Goerge Ang has developed an LDAP query tool ([|pysqldap.py]) that takes an SQL-style query and converts it to LDAP form.

This module parses SQL-like statements, converting:
 * table name to LDAP query DN string
 * where expressions into LDAP queries filters

For example, given:

code
SELECT abc FROM cn=fakename,dn=fakedn.org WHERE name=gnap INTO 'output.txt' ORDER BY cn ASC LIMIT 3
code

the output would be:

code
SELECT abc FROM cn=fakename,dn=fakedn.org WHERE name=gnap INTO 'output.txt' ORDER BY cn ASC LIMIT 3
->
tokens.columns = 'abc'
tokens.basedn = cn=fakename,dn=fakedn.org
tokens.where = 'name', '=', 'gnap'
filter = (name=gnap)
into = ['output.txt']
orderby = ['cn', 'asc']
limit = ['3']
code

=bitfield=
Jeremy Kerr has written a utility for writing Linux device drivers called [|bitfield], a tool to inspect the individual bitfields of a register. bitfield uses platform register definitions, which he parses using a pyparsing grammar:

code
[SPU_Status]
name: Cell SPU Status
width: 32
field: 0:15 Stop-and-signal status
field: 21 Isolate exit
field: 22 Isolate load
field: 24 Isolated mode
field: 26 Stopped: invalid instruction
field: 27 Stopped: single-step mode
field: 28 Waiting on blocked channel
field: 29 Stopped: halt instruction
field: 30 Stopped: stop-and-signal
field: 31 Running
code

(register definition from [cbea], Section 8.5.2)

bitfield can then extract values from a register value as:

code
[jk@pokey ~]$ bitfield SPU_Status 0x20000082
decoding as Cell SPU Status
0x20000082 [536871042]
Stop-and-signal status: 0x2000
Isolate exit: 0x0
Isolate load: 0x0
Isolated mode: 0x1
Stopped: invalid instruction: 0x0
Stopped: single-step mode: 0x0
Waiting on blocked channel: 0x0
Stopped: halt instruction: 0x0
Stopped: stop-and-signal: 0x1
Running: 0x0
code

=clipartbrowser=
Greg Steffensen has developed a GUI [|clipart management program] integrated with the Open Clipart Library to manage clip art. Greg uses pyparsing to implement a simple query interface to select clipart entries by matching tags.

=WURFL - Wireless Universal Resource File=
Armand Lynch has written a Python wrapper for the [|Wurfl] project, using pyparsing to provide an SQL-like [|query language] for retrieving specific wireless devices matching some set of characteristics. Here are some sample queries:

code
q1 = """select id where ringtone=true and rows < 5 and
        columns > 5 and preferred_markup = 'wml_1_1'"""

for wurfl_id in query(q1):
    print wurfl_id

# Let's look for some nice phones
q2 = """select device where all(ringtone_mp3, ringtone_aac,
        wallpaper_png, streaming_mp4) = true"""

# Notice that we can also retrieve device classes
for device in devices.query(q2, instance=False):
    print device.brand_name, device.model_name
code

=CTL4J - Component Template Library for Java=
Boris Buegling is using pyparsing to parse [|CTL4J]'s interface definition language, to autogenerate Java stub code. This component interface definition:

code
#define CTL_Class CTL_Registry
#include CTL_ClassBegin

#define CTL_Constructor1 (const string /*filename*/), 1
#define CTL_Method1 location, operator, (const string /*CI_Type*/, const any /*property*/) const, 2
#define CTL_Method2 location, get, (const string /*CI_Type*/, const any /*property*/) const, 2
#define CTL_Method3 void, regist, (const string /*CI_Type*/, const any /*property*/, const location), 3

#include CTL_ClassEnd
code

generates Java code for abstract stub classes.

=XAQL - Xapian Query Language Parser=
Michel Pelletier describes his query language for the Xapian database here. In his posting, he gives this example of an input query:

code
select xtitle, xlastposted
where xstatus in(published, reposted, expired)
  and xcountryid=4 and xsubcategoryid=41
  and xsalestype IN(individual, professional)
order by xlastposted limit 10
code

which he compiles into this Xapian query:

code
Xapian::Query(((XSTATUSpublished OR XSTATUSreposted OR XSTATUSexpired) AND XCOUNTRYID4 AND XSUBCATEGORYID41 AND (XSALESTYPEindividual OR XSALESTYPEprofessional)))
code

=TAPPY - Tidal Analysis Package in PYthon=
pyparsing is used to flexibly define the data file parser for this [|tidal analysis package] by Tim Cera. Tidal data is provided in a variety of formats; TAPPY uses pyparsing to adapt to these different formats to read in historical tidal data.

=matplotlib=
pyparsing has been included as the TeX parsing engine for John Hunter's [|matplotlib]. matplotlib's [|mathtext] module contains a grammar for a subset of TeX so that it can be rendered into mathematical symbols. Here are some examples of TeX expressions from the matplotlib project page:

code
\int_a^b f(x)\rm{d}x
code

code
\displaystyle\sum_{n=1}^\infty\frac{-e^{i\pi}}{2^n}
code

code
\frac{\partial \phi}{\partial t} + U|\nabla \phi| = 0
code

code
\mathcal{F} = \int f\left( \phi, c \right) dV, $ \newline $ \frac{ \partial \phi } { \partial t } = -M_{ \phi } \frac{ \delta \mathcal{F} } { \delta \phi } $ \newline $ |\nabla\phi| = 1
code

=ldaptor=
pyparsing has also been included as the parser for LDAP filters in [|ldaptor], a pure-Python LDAP module by Tommi Virtanen. ldaptor uses pyparsing to compile LDAP filters such as:

code
(cn=Johnny Carson)
(!(cn=Conan OBrien))
(&(objectClass=Person)(|(sn=Letterman)(cn=Johnny C*)))
(&(!(|(cn=Jay Leno)(cn=David Letterman)))(sn=a*b*c*d))
code

See Tommi's [|talks] on ldaptor, presented at EuroPython '04.
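A compact sketch of an RFC 2254-style filter grammar (this is not ldaptor's actual implementation, just an illustration of the recursive structure):

```python
from pyparsing import (Forward, Group, Literal, OneOrMore, Suppress,
                       Word, alphanums, oneOf, printables)

# Hypothetical sketch: and (&), or (|), not (!) combine nested
# (attr=value) comparisons.
LPAR, RPAR = map(Suppress, "()")
attr = Word(alphanums)
value = Word(printables + " ", excludeChars=")")
comparison = Group(attr + Literal("=") + value)

filt = Forward()                                  # filters nest recursively
combined = Group(oneOf("& | !") + OneOrMore(filt))
filt <<= LPAR + (combined | comparison) + RPAR

res = filt.parseString("(&(objectClass=Person)(|(sn=Letterman)(cn=Johnny C*)))")
print(res.asList())
```

The Forward placeholder is the standard pyparsing idiom for this kind of self-referential grammar.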

=pydot=
[|pydot] is a Python module for creating and plotting both directed and undirected graphs. pydot uses pyparsing for importing graph data from DOT files. [|DOT] is a language for representing graphs, and is part of the AT&T open source [|GraphViz] software. Here is a sample DOT description of a directed graph (from the online DOT Language reference):

code format="c"
digraph G {
    main -> parse -> execute;
    main -> init;
    main -> cleanup;
    execute -> make_string;
    execute -> printf;
    init -> make_string;
    main -> printf;
    execute -> compare;
}
code

Ero Carrera (the author of pydot) sent this feedback:

"Thanks for your work in pyparsing! Michael Krause managed to add a really great dot file parsing feature with it, which would have been really tedious to implement otherwise."

=twill=
C. Titus Brown has used pyparsing to construct a scripting language on top of mechanize to produce [|twill], a scriptable interface for interacting with web pages.

Here is a simple example using twill to access Google's main search page (note that automated access to this page violates Google's Terms of Service - this example is for demonstration purposes only!):

code format="python"
setlocal query "twill Python"

go http://www.google.com/

fv 1 q $query
submit btnI  # use the "I'm feeling lucky" button

show code

=BIND named.conf parser=
Seo Sanghyeon reports using pyparsing to tokenize named.conf files for BIND. These files can be especially difficult to parse, since they can contain recursive structures, with embedded comments in /*..*/, //..., and #... formats. Here is the entire tokenizing grammar:

code format="python"
from pyparsing import *

toplevel = Forward()
value = Word(alphanums + "-_.*!/") | quotedString
simple = Group(value + ZeroOrMore(value) + ";")
statement = Group(value + ZeroOrMore(value) + "{" + Optional(toplevel) + "}" + ";")
toplevel << OneOrMore(simple | statement)

parser = toplevel
parser.ignore(cStyleComment)
parser.ignore("//" + restOfLine)
parser.ignore("#" + restOfLine)
code

Using this grammar, Seo is able to read in the named.conf file, strip the comments, and create a structured parse tree for follow-on processing. (Not bad for 12 lines of code!)
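To see an equivalent grammar in action, here it is run over a small fragment (the example.com zone below is an invented sample, not from Seo's report):

```python
from pyparsing import (Forward, Group, OneOrMore, Optional, Suppress, Word,
                       ZeroOrMore, alphanums, cStyleComment, quotedString,
                       restOfLine)

# Same structure as the grammar above: bare values, simple "...;"
# statements, and recursive "{ ... };" blocks, with comments ignored.
toplevel = Forward()
value = Word(alphanums + "-_.*!/") | quotedString
simple = Group(value + ZeroOrMore(value) + ";")
statement = Group(value + ZeroOrMore(value) + "{" + Optional(toplevel) + "}" + ";")
toplevel <<= OneOrMore(simple | statement)

parser = toplevel
parser.ignore(cStyleComment)
parser.ignore("//" + restOfLine)
parser.ignore("#" + restOfLine)

conf = '''
// sample only
zone "example.com" {
    type master;           # inline comment, stripped
    file "example.com.db";
};
'''
print(parser.parseString(conf).asList())
```

The nested Groups mirror the brace nesting of the config file, so follow-on code can walk the structure directly.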

=Boa Constructor code conversion/upgrade utility=
[|Boa Constructor] uses a pyparsing-based conversion utility to help developers upgrade their scripts written for wxPython 2.4 to wxPython 2.5 (which introduced the 'wx' namespace for their API routines). The following list is an excerpt from the utility header, listing the changes that the Boa Constructor update utility handles automatically:
 * EVT_xxx change from wxPython 2.4 style to 2.5 style
 * .Append for menu, changes 'helpString' to 'help', 'item' to 'text' and wxITEM to wx.ITEM
 * changes the 'map(lambda _init_whatever: wxNewId' to the new format
 * changes the classes from wxName to wx.Name check "self.specialNames" to see which special cases are handled
 * changes the 'init' from wxName.__init to wx.Name.__init check "self.importNames" to see which imports are handled
 * true and false to True and False
 * SetStatusText "(i=" keyword to "(number="
 * AddSpacer "n, n" to wx.Size(n, n)
 * flag= i.e. flag=wxALL
 * style= i.e. style=wxDEFAULT_DIALOG_STYLE
 * orient=wx to orient=wx.
 * kind=wx to kind=wx.

=Process Stalker=
[|Process Stalker] is a utility by Pedram Amini. Process Stalking is a term coined to describe the combined process of run-time profiling, state mapping, and tracing. Consisting of a series of tools and scripts, the goal of a successful stalk is to provide the reverse engineer with an enjoyable interface to filtered, meaningful, run-time block-level trace data. The utilities included with Process Stalker include parsers for:
 * bpl_parser
 * recording_parser
 * register_metadata_parser
 * xrefs_parser

=MathDOM - MathML parser=
[|MathDOM] is a Python module that parses MathML and literal infix terms into a DOM or lxml document and writes out MathML and literal infix/prefix/postfix/Python terms.

=Distribulator - distributed admin command parser=
The [|Distribulator]: Distributed Computing For The Rest Of Us. This is an SSH-based command execution and file transfer utility that includes support for batch, console, and shell integration modes, multiple server environments, and full audit logs.

=Spyse - Python Multi-Agent System Environment=
The [|Secret Python Multi-Agent System Environment] is a platform for building multi-agent systems using Python. Andre Meyer & Co. use pyparsing to implement a parser for the [|3APL language], which is used among the collaborating agent objects. Here are some [|sample 3APL scripts] for two agents, harry and sally, who greet each other, and then thank each other for their greeting:

code
PROGRAM "harry"
CAPABILITIES:
BELIEFBASE: me(harry). you(sally).
GOALBASE:
RULEBASE:
    <- you(You) | Send(0, You, inform, hello(You) ),
    <- me(Me) AND received(V, You, inform, hello(Me)) AND NOT sent(V,You,inform,thanks(You)) | Send(0, You, inform, thanks(You) )

---

PROGRAM "sally"
CAPABILITIES:
BELIEFBASE: me(sally). you(harry).
GOALBASE:
RULEBASE:
    <- you(You) | Send(0, You, inform, hello(You) ),
    <- me(Me) AND received(V, You, inform, hello(Me)) AND NOT sent(V,You,inform,thanks(You)) | Send(0, You, inform, thanks(You) )
code

=FileSystemStorage SQL parser=
[|FileSystemStorage] is a DBAPI 2.0 compliant Python driver for treating the file system as a database. Duncan McGreggor and Ravi Bhalotia are using pyparsing to implement the SQL grammar parsing, to access files as if they were entries in relational database tables.

"We needed to [write drivers, sql parser, and the "database"] in less than two months. Simply and shortly put, this would have been impossible without PyParser. We're about half-way done now, but I couldn't wait to thank you."

=Verilog parser=
I've also done some work on a Verilog language parser. Here is an example of code from that parser, showing the BNF extracted from the Verilog language spec, and the corresponding Python code using the pyparsing module:

code format="python"
"""
<UDP> ::= primitive <name_of_UDP> ( <name_of_variable> <,<name_of_variable>>* ) ;
        <UDP_declaration>+
        <UDP_initial_statement>?
        <table_definition>
        endprimitive
"""
udp = Group( "primitive" + identifier +
             "(" + delimitedList( identifier ) + ")" + semi +
             OneOrMore( udpDecl ) +
             Optional( udpInitialStmt ) +
             udpTableDefn +
             "endprimitive" )
code

(The Verilog parser used to be separately licensed, but is now generally available from the Examples page. If you end up using it commercially or in some way for-profit, please consider one of the charitable contributions listed in the parser's module header.)

Note how the '+' operator overloading makes the code map easily to the input spec, as well as the self-explanatory classes OneOrMore and Optional. The pyparsing module also supports the '^' operator for Or'ing alternative tokens, selecting the longest match found; and the '|' operator for selecting the first match among given alternatives.
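A quick illustrative example (not taken from the Verilog parser itself) of how '|' and '^' differ:

```python
from pyparsing import Word, nums

# '|' (MatchFirst) takes the first alternative that matches;
# '^' (Or) evaluates all alternatives and takes the longest match.
integer = Word(nums)
real = Word(nums) + "." + Word(nums)

first_match = integer | real
longest_match = integer ^ real

print(first_match.parseString("3.1416").asList())    # ['3']
print(longest_match.parseString("3.1416").asList())  # ['3', '.', '1416']
```

In a grammar where integers and reals can both appear, '^' avoids the classic pitfall of the integer alternative stealing the leading digits of a real number.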
