Thursday, December 29, 2011

Invincible Perl



Have you seen the talk about DoSing websites using nothing but crafted form data (POST)? It's this week's trending topic.

Well, here are a couple of links related to the talk, plus some extra info.

The interesting thing that surprised me (or not so much) about that article/talk is that the only language they tested that isn't vulnerable to this attack is... guess what? Perl. Here's an extract from the article:

Julian and Alexander did a great job with checking many programming languages used for web applications for their hash table implementation and hash functions. For all of them they checked, they managed to find a lot of keys mapping to the same output, except for Perl. Perl uses a randomized hash function, i.e. the hash doesn’t only depend on the key, but also on an additional value, that is chosen at startup of the application at random. All other languages also store the query parameters sent in an HTTP GET or POST request in a hash table, so that a request with many query parameters all mapping to the same hash value will slowly fill such a hash table, before the first line of code written by the application programmer will be executed. Filling this hash table will usually take several minutes on a decent CPU, so that even a fast web server can be kept busy using a slow connection.
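To see why colliding keys are so expensive, here's a minimal sketch (in Python, with a deliberately naive chained hash table; this is not Perl's implementation nor any real runtime's) of the quadratic blow-up: when every key lands in the same bucket, each insert has to walk the whole chain, so n inserts cost O(n^2) instead of O(n). Perl dodges this because its hash function mixes in a per-process random seed, so an attacker can't precompute colliding keys.

    import time

    class NaiveHashTable:
        """Toy separate-chaining hash table, standing in for a runtime's dict."""
        def __init__(self, nbuckets=1024):
            self.buckets = [[] for _ in range(nbuckets)]

        def insert(self, key, value):
            bucket = self.buckets[hash(key) % len(self.buckets)]
            for i, (k, _) in enumerate(bucket):  # linear scan of the chain
                if k == key:
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

    n = 10000
    # "Attacker" keys: multiples of 1024 all map to bucket 0 of a 1024-bucket
    # table (a stand-in for real hash collisions in a query-string parser).
    colliding = [i * 1024 for i in range(n)]
    normal = list(range(n))

    for name, keys in (("colliding", colliding), ("normal", normal)):
        table = NaiveHashTable()
        start = time.time()
        for k in keys:
            table.insert(k, "x")
        print(name, round(time.time() - start, 2), "seconds")

The colliding run takes orders of magnitude longer than the normal one, which is exactly what an attacker exploits by sending a single POST full of colliding parameter names.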

And here are the HN comments.

+1 for Perl!
Somewhere between off-topic and related: here's another nice talk from 28C3, which is taking place these days in Berlin.

Tuesday, December 27, 2011

Git, the stupid content tracker, or not so


After roughly a year of using git daily, one has already crossed out of the "WTF!?" territory and into the "It's obvious" land.

After getting help from all the coworkers I had last year, lots (I mean *LOTS*) of reading, and many moments of "Am I the only one in the world who doesn't get this?", all the regular operations now happen without thinking, and I know pretty much what's happening under the hood.

Some things still feel a bit rough around the edges, mostly because I don't use them that often (submodules, bisect...), but I digress...

The thing is that knowing and using git gives you extra power, not directly related to versioning code. As Linus said, "git is the stupid content tracker": it manages blobs of bytes.

  • git grep: probably faster than grep -ri, and more focused on what you actually want to search. I've already integrated it with emacs, and I try to use it more and more instead of rgrep or ack.
  • git ls: great for find-file-in-project-style functionality.
  • git log -Sfoo: search the whole history for changes that add or remove "foo"
  • git log -p: see the modifications in context
  • git annex: manage whatever content you throw at it
  • using git to deploy: there's Capistrano, and Puppet, and Chef, and... but git can handle it if configured properly. This can probably end up in a mess if you need to trigger many things when deploying, but you know, there are hooks and everything, and you can build your poor man's Capistrano. It's just your choice. For mostly static sites, though, it's definitely a nice thing to keep in mind (see the sketch after this list).
  • Not really a git feature, but thanks to magit or fugitive you get pretty painless integration with your editor workflow, so it's a win. And you feel safer.
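As promised in the deploy bullet, here's a rough sketch of that poor man's Capistrano: a bare repository on the server plus a post-receive hook that checks the pushed branch out into the web root. The paths, server, and remote name are made up for illustration.

    # On the server: a bare repo and a directory to serve (hypothetical paths)
    git init --bare /srv/git/site.git
    mkdir -p /srv/www/site

    # /srv/git/site.git/hooks/post-receive (remember to chmod +x it)
    #!/bin/sh
    GIT_WORK_TREE=/srv/www/site git checkout -f master

    # On your machine: add the server as a remote and push to deploy
    git remote add deploy user@server:/srv/git/site.git
    git push deploy master

For anything beyond a static site (restarting services, running migrations...) the hook grows fast, which is where Capistrano and friends start to earn their keep.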

Have more tricks? Comment!

OK, a bit more than a month later, there's a post on precisely that. Nothing new in the article, but there are nice insights in the YC comments.

Sunday, December 11, 2011