Pretty much every game handles networking differently depending on the genre and needs of the game; I'd go into more detail, but it would end up being an entire book.
The basic idea is that action games are typically reaction based: if you and your enemy are shooting each other, the server needs to deliver the results fast enough that players don't notice the latency (of which there is plenty, no matter what). The server must send state updates much more often to give an FPS enjoyable gameplay.
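To make "send state updates much more often" concrete, here's a rough Python sketch of a fixed-tick server loop (a toy example of my own; the function names and structure are assumptions, not any real engine's code):

```python
import time

def run_server_loop(tick_rate_hz, broadcast_snapshot, should_stop):
    """Fixed-timestep server loop: simulate one step, then broadcast state.

    tick_rate_hz, broadcast_snapshot, and should_stop are placeholder names;
    a real engine's loop is far more involved (catch-up ticks, per-client
    rates, delta compression, interest management, etc.).
    """
    tick_interval = 1.0 / tick_rate_hz
    tick = 0
    while not should_stop():
        start = time.monotonic()
        # ... run one simulation step here ...
        broadcast_snapshot(tick)  # send the world state to every client
        tick += 1
        # sleep off whatever is left of this tick
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, tick_interval - elapsed))

# An FPS might run something like this at 30-60+ ticks per second; many MMOs
# get away with far fewer, which is a big part of why they scale so well.
```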
Keep in mind that some MMOs also offer PvP, which latency heavily impacts as well, and that mobs are generally server-side in the typical implementation.
When I talk about latency I'm not talking about ping; I call that lag. I'm talking about the inevitable overhead a given network architecture adds before one player sees the actions of another. In our case, even if every player had a good ping of under a hundred milliseconds, they would still wait a few hundred ms before seeing the results of another player's actions. This overhead is usually mitigated by higher state update rates, by extrapolation, or both; extrapolation is normally a fickle mistress and hard to get right.
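If it helps, here's a minimal Python sketch of the difference between interpolation and extrapolation for a remote player's position (illustrative only; the snapshot format and the delay are my assumptions, and positions are plain floats or anything supporting + and *):

```python
def interpolate_position(snapshots, render_time):
    """Interpolation: render the remote player slightly in the past, blending
    between two snapshots the server already sent. `snapshots` is assumed to
    be a time-sorted list of (timestamp, position) pairs with at least one
    entry; render_time is current time minus the interpolation delay
    (e.g. 100 ms), which is exactly the added latency overhead."""
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            alpha = (render_time - t0) / (t1 - t0)
            return p0 + (p1 - p0) * alpha
    # No bracketing pair: fall back to the newest known position.
    return snapshots[-1][1]

def extrapolate_position(last_pos, last_vel, time_since_update):
    """Extrapolation (dead reckoning): keep moving the remote player along
    their last known velocity until the next update corrects us. It hides
    more latency, but mispredicts badly on sudden turns, hence 'fickle'."""
    return last_pos + last_vel * time_since_update
```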
Even in PvP, most MMOs hide this latency overhead easily and it doesn't really matter. In the type of MMO I'm talking about, the only way to damage another player or NPC is by using a skill (even regular attacking is a skill with a tick interval), and every skill is either target-based or AoE-based. You're not shooting a projectile that needs precise timing to collide with the enemy player: RNG determines whether your skill hit or not, and your client just makes the "projectile" follow the target and hit or miss depending on that result, even if they teleport from lag. MMOs can tolerate a lot of lag since latency is hidden from the players much better and much more easily.

You can tell WoW doesn't bother with extrapolation: if you lag out or disconnect, other players stop moving rather than continuing along their last path until the server corrects them (the running animation still plays, though, so it extrapolates that much). That suggests they use interpolation instead, which means they add at least 50-100 ms of latency overhead (I don't really know the estimate). So they don't even care about a few hundred ms of total delay (both players' ping plus the overhead from the network implementation), and indeed it isn't a problem for most players until they get above about 150 ms ping to the server and their skills start taking a noticeable amount of time to register.
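Roughly, target-based skill resolution looks something like this (a toy Python sketch with made-up class and field names, not any particular game's code):

```python
import random
from dataclasses import dataclass

@dataclass
class Skill:
    name: str
    hit_chance: float  # e.g. 0.85
    damage: int

@dataclass
class Character:
    id: int
    hp: int

def resolve_skill(attacker: Character, target: Character, skill: Skill) -> dict:
    """Server-side, target-based skill resolution: the outcome is decided by
    an RNG roll the moment the skill fires, not by projectile physics."""
    hit = random.random() < skill.hit_chance
    if hit:
        target.hp -= skill.damage
    # Clients just animate a homing "projectile" that follows the target and
    # lands (or whiffs) to match this result, even if the target teleports
    # from lag in the meantime.
    return {"attacker": attacker.id, "target": target.id,
            "skill": skill.name, "hit": hit}
```

The point is that hit or miss is decided server-side the instant the skill fires, so a few hundred ms of display delay never changes the outcome.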
The reason you don't see many FPS MMOs is that a M(assive)MO game is very hard to network as it is. These slower update rates, running on more robust, low-cost infrastructure, are what allow such a massive player base, since there isn't nearly as much processing happening per player. At that point UDP becomes somewhat unnecessary as well (although I think they should still use it, but meh), which is why you see games like WoW using TCP.
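For what "using TCP" actually means at the message level, here's a generic length-prefix framing sketch (my own example, not WoW's wire format). TCP hands you a byte stream with no message boundaries, so the game adds its own framing; the trade-off is that a lost packet stalls everything behind it until retransmission, which is tolerable at MMO update rates but painful in a fast shooter.

```python
import socket
import struct

def send_message(sock: socket.socket, payload: bytes) -> None:
    """Frame a message with a 4-byte big-endian length prefix, a common
    convention for game messages over TCP (purely illustrative here)."""
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_message(sock: socket.socket) -> bytes:
    """Read exactly one framed message off the stream."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack("!I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf
```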
Anyway, this is getting pretty off topic!
