
PyCon: Code Generation in Python: Dismantling Jinja

See the website.

See also bit.ly/codegeneration.

Is eval evil? How does it impact security and performance?

Use repr to get something safe to pass to eval for a given type.
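For example (my sketch, not from the talk), this only holds for simple built-in types:

data = {'name': "O'Brien", 'scores': [1, 2, 3]}
text = repr(data)      # repr of built-in types is valid Python source
restored = eval(text)  # so it round-trips through eval
assert restored == data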

Eval code in a different namespace to keep namespaces clean.
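Here's a small sketch of what I think he means (mine, not his): pass explicit dicts to eval so the evaluated code never touches your module's namespace.

ns = {'x': 21}
result = eval('x * 2', {'__builtins__': {}}, ns)  # explicit globals and locals
assert result == 42

(Passing explicit dicts keeps the evaluated code away from your globals; blanking __builtins__ is not a real sandbox, though.)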

Using code generation results in faster code than writing a custom interpreter in Python.

Here is a little "Eval 101".

Here is how to compile a string to a code object:
code = compile('a = 1 + 2', '<string>', 'exec')
ns = {}
exec(code, ns)  # Python 2 spelling: exec code in ns
assert ns['a'] == 3
In Python 2.6 or later, use ast.parse('a = 1 + 2'), and then pass the result to the compile function.
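Something like this sketch:

import ast
tree = ast.parse('a = 1 + 2')             # source -> abstract syntax tree
code = compile(tree, '<string>', 'exec')  # compile the tree instead of a string
ns = {}
exec(code, ns)
assert ns['a'] == 3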

You can modify the ast (abstract syntax tree).

You can assign line numbers.
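A minimal sketch of both of those points (mine, not his):

import ast

class SwapAddForMult(ast.NodeTransformer):
    # Rewrite every addition into a multiplication.
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

tree = SwapAddForMult().visit(ast.parse('a = 3 + 4'))
ast.fix_missing_locations(tree)  # fill in line/column numbers on any new nodes
ns = {}
exec(compile(tree, '<string>', 'exec'), ns)
assert ns['a'] == 12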

You don't have to pass strings to eval and exec. You can handle the compilation to bytecode explicitly. You can also execute the code in an explicit namespace.

Jinja mostly has Python semantics, but not exactly. It uses different scoping rules.

Lexer -> Parser -> Identifier Analyzer -> Code Generator -> Python Source -> Bytecode -> Runtime

Everything before the runtime can be done ahead of time and cached.

Because WSGI deals in iterables of output chunks, Jinja also uses generators for output.
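For example, Template.generate() yields the rendered output in chunks instead of building one big string (my sketch):

from jinja2 import Template
template = Template('{% for item in items %}<li>{{ item }}</li>{% endfor %}')
for chunk in template.generate(items=['a', 'b', 'c']):
    print(repr(chunk))  # each chunk is a piece of the rendered output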

You can run untrusted template code with Jinja. They restrict what the Python can do. (I'm skeptical.)

They have automatic escaping.
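For example (my sketch), with autoescaping turned on, plain values come out HTML-escaped:

from jinja2 import Template
print(Template('{{ x }}', autoescape=True).render(x='<b>unsafe</b>'))  # &lt;b&gt;unsafe&lt;/b&gt;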

In the art of code generation, you must think in terms of low level vs. high level.

(I got a little confused at this point about whether Jinja generated bytecode, ASTs, or Python source. Later in the talk, it seemed like he was saying that Jinja always generated Python code because it was the only workable option at the time.)

Using the ast module only became an option later.

He thought about generating bytecode. However, that doesn't work on Google App Engine. Furthermore, it was too implementation specific.

Using the ast module is more limited. However, it's easier to debug. Furthermore, it does not segfault the interpreter (at least starting in Python 2.7).

Using pure source code generation always works. However, it's very limited, and it's hard to debug without hacks.

The ast module is much better.

Jinja is way faster than Django templates.

Code running in a function is faster than running at global scope because local variable lookup is faster.
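Here's a rough micro-benchmark sketch of that claim (mine, not his):

import time

global_src = '''
total = 0
for i in range(1000000):
    total += i
'''

function_src = '''
def loop():
    total = 0
    for i in range(1000000):
        total += i
loop()
'''

for label, src in [('global scope', global_src), ('inside a function', function_src)]:
    code = compile(src, '<timing>', 'exec')
    start = time.perf_counter()
    exec(code, {})  # global-scope names are dict lookups; the function body uses fast locals
    print(label, time.perf_counter() - start)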

They keep track of identifiers and follow them through the template source.

The context object in Jinja2 is a data source (read only). In Django, it's a data store (read write).

What happens in the include stays in the include. An include can't change a variable in an outer scope.
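For instance (my sketch, using a DictLoader):

from jinja2 import Environment, DictLoader
env = Environment(loader=DictLoader({
    'inner.html': "{% set greeting = 'hi' %}{{ greeting }}",
    'outer.html': "{% set greeting = 'hello' %}{% include 'inner.html' %} {{ greeting }}",
}))
print(env.get_template('outer.html').render())  # "hi hello" -- the include's assignment doesn't leak out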

Jinja looks at your template and only generates more complicated code when the template actually needs it.

{% for item in sequence %} creates item in a context that's only valid in the for loop.
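For instance (my sketch; with the default undefined behavior, the trailing item renders as an empty string):

from jinja2 import Template
out = Template('{% for item in seq %}{{ item }},{% endfor %}|{{ item }}').render(seq=[1, 2, 3])
print(out)  # 1,2,3,| -- item no longer exists after the loop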

Jinja used manual code generation because it was the only option. AST compilation is new in Python 2.6.

A Markup object wraps a string, but has autoescaping. It uses operator overloading. Jinja can do some escaping at compile time.
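A small sketch using the markupsafe package that Jinja2 builds on:

from markupsafe import Markup
safe = Markup('<em>Hello</em>, ')
combined = safe + "<script>alert('x')</script>"  # __add__ escapes the plain string operand
print(combined)  # the <em> tag survives; the script tag comes out HTML-escaped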

Undefined variables in Jinja are replaced by undefined objects so that they print out as empty strings. However, doing an attribute lookup on such an object raises an exception.
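For example (my sketch; this is the default Undefined class):

from jinja2 import Template
from jinja2.exceptions import UndefinedError

print(Template('[{{ missing }}]').render())  # prints [] -- undefined renders as an empty string
try:
    Template('{{ missing.attr }}').render()  # attribute lookup on an undefined object
except UndefinedError as exc:
    print(exc)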

He would use the ast module if he had to do it all over again.
