Parsing user input, creative data, and HTML is hard. I'm sure you've whipped up some pretty creative regular expressions that almost work most of the time when dealing with these things, but it's often easy for attackers, or even non-malicious end-users, to trip up your carefully crafted regex and make your pages look horrible (or worse). In this talk, we'll discuss a few practical examples of how taking a token-based approach to parsing code and markup can save you plenty of time in the long run, and, more importantly, will actually prevent your replacements from failing.
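As a small illustration of the kind of contrast the abstract describes (not an example from the talk itself), here is a hedged sketch in Python: a tokenizer-based tag stripper built on the standard library's `html.parser`, which correctly handles an input that trips up the naive regex approach.

```python
import re
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    """Collects only text content, ignoring tags entirely."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_tags(html):
    """Remove markup by tokenizing it rather than pattern-matching it."""
    parser = TagStripper()
    parser.feed(html)
    parser.close()
    return "".join(parser.parts)

# An attribute value containing '>' is legal HTML, but it breaks the
# common regex shortcut, while the tokenizer handles it correctly.
tricky = '<a href="#" title="a > b">link</a> text'
print(strip_tags(tricky))                      # tokenizer: 'link text'
print(re.sub(r'<[^>]+>', '', tricky))          # regex: leaves ' b">' behind
```

The tokenizer never has to guess where a tag ends; the parser state tells it, which is exactly why this style of replacement doesn't fail on edge-case input.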



Most excellent talk. Way over my head for most of it. Really liked the validation with tokens, especially the simple email validation script.

Good talk, explained the idea of tokenizing in a way that the room really seemed to grasp.

I've been working with email templates for web2project and hadn't come up with a nice clean way of handling them... this looks like a much better and cleaner strategy that I'm going to investigate.