I'm doing that on two of my PCs. The one in our bedroom has HDMI running from the PC straight to a 40" TV, with sound on a separate connection through the motherboard's onboard audio.
For my main HTPC, I'm running HDMI out to a 5.1 head unit, and then HDMI from there to a 46" TV. Sound comes from the graphics card over the HDMI and is pushed through the 5.1 head unit, which strips out the audio and passes the video on to the TV.
Whether or not the image "sucks" depends on the connection you're using and the TV itself. If you're trying to run sound and video, go with HDMI. If you're trying to run just video, go with HDMI, DisplayPort (if the TV has it), or DVI (if the TV has it). I would not recommend the old VGA standard unless it's the only option you have.
There is no real video slowdown inherent to running over a TV itself. There are things on both the TV and video card side you'll have to tweak, like how it displays color: my 40" was hypersaturated, my 46" was fine. However, my 46" had overscan/underscan issues. All of those can be corrected via AMD's Catalyst Control Center or Nvidia's Control Panel.
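If you want a feel for what overscan actually costs before you start dragging sliders in Catalyst Control Center, here's a rough back-of-the-envelope sketch in Python (the 5% figure is just a ballpark I picked for illustration, not a value from AMD's or Nvidia's docs):

    # Rough sketch: how many pixels a given overscan percentage crops
    # from a 1920x1080 signal. The 5% default is an assumption, not a
    # number from any driver.
    def overscan_loss(width=1920, height=1080, overscan_pct=5.0):
        lost_w = round(width * overscan_pct / 100)
        lost_h = round(height * overscan_pct / 100)
        return lost_w, lost_h

    print(overscan_loss())  # (96, 54): almost 100 columns of desktop gone

That's why an uncorrected picture looks soft and clips the edges of the taskbar: the TV is throwing away real pixels and stretching what's left.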
Assuming you're running it for gaming, chances are you are not running onboard video. It's possible to do so, but you might run into the performance issues that are inherent to onboard video. No matter what you run, though, make sure you disable the onboard audio chipset--unless you're going to pipe sound over the HDMI, in which case enable it and disable all other sound outputs.
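If you want to sanity-check what the OS actually sees after flipping those settings, a few lines of Python will list every output. This is just a sketch assuming the third-party sounddevice package (pip install sounddevice); the device names are whatever your drivers report, so yours will differ:

    # List every audio output the OS exposes, so you can confirm the
    # HDMI device (usually named after the GPU or receiver) is present
    # and the onboard chipset is gone once you've disabled it.
    import sounddevice as sd

    for idx, dev in enumerate(sd.query_devices()):
        if dev["max_output_channels"] > 0:  # playback devices only
            print(idx, dev["name"], "-", dev["max_output_channels"], "ch")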
Finally--one thing to realize is that games are, in general, designed for a monitor viewed up close. In a few games I have run, the subtitles or other in-game text can be hard to read unless you have the eyes of a hawk. Alice: Madness Returns is a prime example. On a 24" screen, text 1/10" high is fairly easy to read from about 2' away. That same text on a 46" screen is about 2/10" high, but you may be looking at it from 10-15 feet away.
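If you want to put numbers on that, what matters is the angle the text subtends at your eye, not its physical height. A quick sketch using the screen sizes and distances above (nothing else assumed):

    import math

    def arcminutes(text_height_in, distance_in):
        # Angle the text subtends at the eye, in arcminutes.
        return math.degrees(2 * math.atan(text_height_in / (2 * distance_in))) * 60

    # 1/10" text on the 24" monitor from 2' (24") away:
    print(round(arcminutes(0.1, 24), 1))   # ~14.3 arcminutes
    # 2/10" text on the 46" TV from 12' (144") away:
    print(round(arcminutes(0.2, 144), 1))  # ~4.8 arcminutes

Twice the physical height, but roughly a third of the apparent size from across the room--hence the squinting.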