As someone working in the game industry with 7 titles shipped, let me just tell you all this.
On a 200-man team, there's usually only a handful of people, other than the testers, who actually play the game they're working on. (And testers test functionality. Sadly, most testers' opinions of what is allowable or not aren't worth sh*t to the dev team. They see "Does it work?" and have two answers: "Yes" or "No". There is no "Yes, but it sucks" or "Yes, but it could be faster" option.) Most of the engineers and artists will simply try their features in a gym area (a sandbox), or stress test them outside of the game environment. Those who do play their own game are usually praised (or frowned upon for wasting so much time not "working").
The U.I. lag and the bugs and everything people complain about? Probably fixes that have been de-prioritized and pushed back as post-launch patches, because something else equally important had to be done before the ship date.
And most people on the team who actually care (the designers) will struggle to get the important parts fixed, only to helplessly shrug as the deadlines and task lists get shuffled around by the leads and development directors.
It's not a rosy world where every game that ships out the door has been given a thumbs up by every member of the team, believe me.
Hi. This is my first post...hurrah..
I'm a Quality Assurance analyst for a small development house. We don't make games, but I'm familiar with the development cycle, so I was taken aback by this comment:
Sadly, most tester's opinion of what is allowable or not isn't worth sh*t to the dev. team. So they see "Does it work?" and have two answers: "Yes" or "No". There is no "Yes, but it sucks." or "Yes, but it could be faster" option
Either your testing team sucks or they were given terrible requirements. In the QA world we use metrics: measurements of quality.
Most people don't actively think about these. Let's say you load up the ZAM site. How long does it take to load on an average internet connection? How long *should* it take to load and still be acceptable from a user's standpoint? If google.com took 5 minutes to load, you can imagine the user/customer/whatever would be pretty mad about it. It's measurements like these that testers use to ensure quality.
So a real testing team should be asking questions like "How long should it take to go from this menu to that menu? How many clicks should it take to perform this certain task?" These are the requirements any QA analyst would need in order to ensure a quality product. If someone hands your testing team a copy and asks "does it boot up?", that's a pretty bad requirement, and it leads to a bad product. Without proper metrics, it's easy for a testing team to pass the product off as quality, since the requirements don't state *how much* lag is acceptable in the menus (for example).
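To make that concrete, here's a minimal sketch of what a metric-driven check could look like, using Python. Everything here is hypothetical: `open_menu()` is a stand-in for whatever operation is being timed, and the 200 ms budget is an invented example of a requirement that someone would have to agree on up front.

```python
import time

# Hypothetical metric: "the menu must open in under 200 ms".
# The number itself is the requirement; the test just enforces it.
MENU_OPEN_BUDGET_SECONDS = 0.2

def open_menu():
    # Placeholder for the real operation being measured.
    time.sleep(0.05)

def test_menu_opens_within_budget():
    start = time.perf_counter()
    open_menu()
    elapsed = time.perf_counter() - start
    # The tester's answer is no longer just "yes/no, it works" --
    # it's "it works, and it meets (or misses) the agreed budget".
    assert elapsed < MENU_OPEN_BUDGET_SECONDS, (
        f"menu took {elapsed:.3f}s, budget is {MENU_OPEN_BUDGET_SECONDS}s"
    )

test_menu_opens_within_budget()
```

The point isn't the tooling; it's that once a requirement is stated as a number, "Yes, but it could be faster" becomes a failing test instead of an opinion nobody listens to.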
Sadly, this seems to be how a good deal of development houses work these days. Instead of the software development cycle being driven by the development team, they let the sales and marketing teams run it. This means a deadline is usually set well before the programmers and testers know what's about to hit them, and then they're in a rush to complete the project before the arbitrary deadline set by the sales team. When things are rushed, requirements are rushed and sloppy, which causes programmers to write crap code and testers to pass the product because the only requirement is that it doesn't crash or blow up. No doubt SE did this. Also, no, the developers DO NOT play their own game. They do what's called "unit testing": a basic test to make sure a section of code can stand on its own two legs (so to speak). Each dev does this for their own code but rarely tests the dependencies (all the code that depends on what they're working on). That's what QA analysts and beta testers are for!
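For anyone unfamiliar with the term, here's a tiny illustration of a unit test, again in Python. The `damage()` function is made up purely for the example; the thing to notice is that the test exercises that one piece of code in isolation and says nothing about the systems that depend on it.

```python
import unittest

def damage(base, multiplier):
    """Hypothetical example function: compute damage for a single hit."""
    return int(base * multiplier)

class DamageUnitTest(unittest.TestCase):
    # A unit test proves this function stands on its own two legs.
    # It does NOT prove the UI, the netcode, or anything else that
    # consumes damage() still works -- that's integration/QA territory.
    def test_basic_hit(self):
        self.assertEqual(damage(10, 1.5), 15)

    def test_no_multiplier(self):
        self.assertEqual(damage(10, 1.0), 10)

unittest.main(exit=False, argv=["damage_test"])
```

So a dev whose unit tests all pass can honestly say "my code works" without ever having played the game, which is exactly the gap the QA analysts and beta testers are supposed to fill.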
I'm willing to bet the SE dev team is dreading each day they go into work since the launch.