First of all, the refresh rates are quite a bit better on plasmas: most are 600 Hz, as opposed to the 60-240 Hz of LCDs.
Plasmas aren't my specialty, but it's my understanding that these numbers aren't directly comparable between plasmas and LCDs. The 600 Hz "sub-field drive" is essentially a marketing term invented as an answer to high-refresh-rate LCDs. I'm not even sure "refresh rate" is a term that makes any sense when applied to a plasma... the two technologies work on fairly different principles.
But there's one important thing to know:
I'm not sure how much faster this is in practice or whether you can even notice it, but my brother has a 240 Hz LED, and the lag from the TV alone makes certain online games (twitchy PvP games) unplayable for him. So this is probably my biggest consideration.
Refresh rate has nothing to do with input lag - and neither does response time, which is another spec often confused with it. Despite being one of the most important considerations for gamers buying a TV, input lag isn't advertised by any manufacturer that I know of. To find out how laggy a TV is, you have to either read reports online from other consumers or bring equipment to the store and test it yourself.
There are ways to reduce input lag. Some TVs have a special "Game Mode" for this. You should also try to operate games at the native resolution of the TV so that it doesn't have to upscale or downscale them.
All that refresh rate affects is the perceived smoothness of the picture's motion. It works like this: a TV with a refresh rate of 120 or 240 Hz analyzes video with a low framerate and interpolates new frames in between the existing frames. This makes video shot at 24 or 30 fps look like it was shot at 60 fps or more, which most people perceive as noticeably smoother.
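For the curious, here's a toy sketch of what "interpolating new frames in between" means. This is just my illustration, not anything a TV actually runs: real sets do motion-compensated interpolation (estimating where objects move between frames), which is far fancier than the plain cross-fade shown here.

```python
def interpolate_frames(frame_a, frame_b, n_inserted=1):
    """Synthesize n_inserted in-between frames for two source frames.

    frame_a, frame_b: flat lists of grayscale pixel values (0-255).
    A naive stand-in for TV motion interpolation: each new frame is a
    weighted blend of the two originals, raising the effective framerate.
    """
    frames = []
    for i in range(1, n_inserted + 1):
        t = i / (n_inserted + 1)  # fraction of the way from frame_a to frame_b
        blended = [round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]
        frames.append(blended)
    return frames

# One inserted frame per original pair roughly doubles 24 fps to 48 fps.
dark = [0, 0, 0, 0]
bright = [100, 100, 100, 100]
middle = interpolate_frames(dark, bright, n_inserted=1)[0]  # halfway blend
```

A plain cross-fade like this is exactly why cheap interpolation produces ghosting artifacts on fast motion; the motion-compensated versions in real TVs try to shift objects along their estimated paths instead of just blending pixels in place.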
Many people don't like this effect on TV shows and movies because they feel it makes them look "cheap" (soap operas and similar shows are shot at a higher framerate, and this duplicates that effect). But it's pretty great for games, especially for something capped at 30 fps like FFXIV (unless they fix it by release, which I hope to god they do). The effect can be turned down or off, in any case.
There may be a catch. When I was shopping a couple years ago, motion interpolation tended to introduce ugly artifacts into the video. Look up something called the triple ball effect. I've found that it's very noticeable in some games and almost invisible in others. The software has probably also been improved since then.
Secondly, the price. I can get a plasma for about half the price of an LCD, and about a third the price of an LED. Granted, plasmas don't last as long, but I figure that by the time I need to replace it, an LCD will probably cost half of what it does now anyway.
Other typical bad plasma stuff... it's heavier and more prone to burn-in (though I understand that this problem is nearly moot now). Anything else?
Lifespan and burn-in are pretty much non-issues at this point. Most videophiles prefer the picture of a plasma because it's richer and has better black levels. I'm surprised that they're cheaper than LCDs now, but make sure you're getting something with comparable specs. You should get one that displays 1080p, and you definitely don't want one that only does 1024 x 768.
Something else to consider is power use. LCDs are more energy efficient.