Yeah, it's a good idea, but IMO it's really only useful for mobile/laptops, or maybe a Mini-ITX system with a low-end graphics card that's better than the Intel stuff. For a gaming PC, even a Mini-ITX system with a higher-end graphics card, it's really kind of pointless. On laptops, even though the discrete GPU is considered "dedicated," it's still part of the system much as the onboard video would be anyway, so it makes sense that it would use the same outputs.
If Intel had better onboard video like AMD does with their "Fusion" platform, they wouldn't really need this "hybrid" ability. Take the Llano platform for example: it has a decent Radeon 6500-series GPU on the CPU. You can still Crossfire it with a similar-grade card, or opt not to use it and put in a much more powerful dedicated card.
Ahh that's right, I recall reading some info on that setup awhile back.
Using the onboard just because it's an HDMI connection would actually be counterintuitive, as your graphics card is more powerful than the onboard video.
Just to clarify what they're wanting to do.
They're intending to use the Lucid Virtu (or whatever it's called) software and the Z68 board to be able to use the 8800GT through the board, outputting over the HDMI that's on the motherboard.
It's a hybrid graphics system, allowing you to use either the dedicated or the onboard video for various tasks, all through the motherboard's outputs instead.
I was under the impression the onboard would use the video card's outputs instead, but it's just the opposite. There was an article I read earlier detailing how to set this all up; it needs both the mobo and GPU connected to the monitor, or swapping back and forth, but once it's all set up you only connect the mobo to the monitor.
It's a pain in the arse to set up right, and really pointless IMO. Just get a display with digital inputs, use the dedicated GPU, and call it a day.
Merged now. I noticed the first post the other day and was scratching my head; now, looking at the system specs, I'm scratching my head even more. The OP would have been tons better off shaving costs on the build and adding a newer GPU and better monitor right from the start. But that's all hindsight now.
Going from an older LCD without digital inputs to a new, larger LED-backlit monitor with them will be a noticeable improvement. Your image will be much sharper and brighter, and you will have the option of running higher resolutions.
Now to get to a bit of clarity for you. High Definition (HD) is a term describing a specific set of resolutions: 1,280×720 pixels (720p) or 1,920×1,080 pixels (1080i/1080p). As bone said, it's not a matter of converting your PC to HD; it's simply whether or not your video card and monitor support those resolutions. Your current monitor doesn't natively support 1,920×1,080, but your computer is fully capable of pushing out a signal at that resolution.
Secondly, as Shy and I mentioned, using HDMI over DVI isn't going to be any different. Using the onboard just because it's an HDMI connection would actually be counterintuitive, as your graphics card is more powerful than the onboard video.
You say almost all the games you run now play on high settings with that older card. This tells me you aren't running any of the new, highly graphically intensive games, or you aren't running them at the highest resolution you can. If you only buy the new monitor and run it at 1920×1080, you might actually see a drop in game performance, as the higher resolution will be more demanding on your graphics card. So you might go with Shy's advice and get both at the same time, if you can.
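To put rough numbers on that (just a quick pixel-count calculation, assuming the stock resolutions of both monitors; it's not a benchmark):

```python
# Pixels per frame at each monitor's native resolution.
# More pixels per frame means more work for the GPU, all else equal.
old = 1680 * 1050   # current Acer X221W
new = 1920 * 1080   # a 1080p monitor

print(old)  # 1764000 pixels per frame
print(new)  # 2073600 pixels per frame
print(round(100 * (new / old - 1), 1))  # ~17.6% more pixels to render
```

So the 8800GT would be pushing roughly 17-18% more pixels every frame at 1080p, which is where the performance drop comes from.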
I had a long post to that effect as well, but figured I'd give up on it...
They're not getting the point.
Rich, can this thread be merged with the other one, since it IS on the same topic as the first one anyways?
I looked up their monitor and it only has a VGA connection; no DVI or HDMI input.
VGA should still be capable of HD resolutions; the image quality might just suffer a bit, and any sort of HDCP-protected content may not work, since you'd need DVI, HDMI, or DisplayPort for HDCP compliance.
If it were me, I'd consider getting both parts in an upgrade, better monitor and video card.
Though ideally a GTX 560 Ti would be nice, you could probably get a GTX 560 (non-Ti) and a new monitor for the cost of a GTX 560 Ti 448-core card.
Also, the 448-core cards are limited editions built from leftover, partially defective GTX 570 chips; for another $30-50 you can get a fully functional GTX 570.
The 448-core cards also use 40W more power than the 384-core cards, and they don't perform THAT much better than the prior models either: maybe 10-15 fps better in some games/resolutions, at a $220-250 price tag versus $290+... Even a highly overclocked 384-core card costs less and is closer in performance.
The only real advantage to them is triple-SLI support: you can get a 3-way SLI setup for a bit less than it would cost with full GTX 570 cards.
Also, what I don't understand is why you're bothering with the Z68/Sandy Bridge onboard video anyway.
Just because it has HDMI? Big deal... if you get a new monitor with DVI or HDMI, just plug the 8800GT straight into it, bypass the onboard, and forget about it.
Simple as that.
You're not going to save much in power consumption using the hybrid mode anyway, as the 8800GT will still be sitting there idle using some power, so it's not really worth it IMO.
If it were me, I'd be looking at a monitor and a lesser video card. If you plan on running dual cards down the road with an SLI motherboard, getting a somewhat lesser-spec card and adding a second one later is feasible.
Newegg.com - Recertified: PNY RVCGGTX4601XXB-OC GeForce GTX 460 (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 SLI Support Video Card
Newegg.com - Acer S231HLbid Black 23" 5ms HDMI LED-Backlight LCD monitor Slim Design 250 cd/m2 ACM 12,000,000:1 (1000:1)
Just under $300 total.
It's not a 560 Ti, but it's still a very capable card, and many times more powerful than the 8800GT you have now.
460s are harder to find these days, but there are still quite a few of them out there.
I'm not really certain why you're overemphasizing the HDMI aspect here, as HDMI is simply the connection type and offers little advantage over your current DVI connection. The only real advantages of a newer, larger screen would be the added size and support for higher resolutions.
My new build has a Z68 motherboard that supports HDMI, so I was thinking maybe a new monitor for now, then purchasing the newer 448-stream-processor card in the spring when the prices come down a bit. My build is below.
GIGABYTE GA-Z68X-UD3H-B3 LGA 1155 Intel Z68 HDMI SATA 6Gb/s USB 3.0 ATX Intel Motherboard
Intel Core i7-2600K Sandy Bridge 3.4GHz Quad Core
G.SKILL Ripjaws X Series 16GB 240-Pin DDR3 SDRAM DDR3 2133 (PC3 17000) Desktop Memory Model F3-17000CL11Q-16GBXL
Crucial M4 CT128M4SSD2 2.5" 128GB SATA III MLC Internal Solid State Drive (SSD)
OCZ GameXtreme 700 PSU
EVGA 8800GT GPU
COOLER MASTER Hyper 212 Plus
Corsair Carbide Series 400R Graphite grey and black Steel / Plastic ATX Mid Tower Gaming Case
Windows 7 Home Premium 64Bit
SONY Black 18X DVD-ROM 48X CD-ROM SATA DVD-ROM Drive Model
Seagate 160Gb External HD
That's a tough one... but it does sound like a new monitor would be cool. AND, you can use the old monitor too for a dual-monitor setup.
New HDMI monitor or GTX 560 ti GPU?
Please help me decide whether to get a new monitor, an ASUS VS248H Full HD LED monitor with a 50,000,000:1 high contrast ratio, an HDMI interface, and 1920x1080p resolution, or a new GPU, an MSI N560GTX-Ti Twin Frozr II/OC. Right now my monitor is an approximately 5-year-old Acer X221W at 1680x1050 resolution with NO HDMI capability, and my graphics card is an EVGA 8800GT, which can play most games on high with my new build. I am leaning towards the ASUS HDMI monitor at the moment, as I can only afford to buy one of the above right now. Any thoughts or opinions to help me choose?
I would think the board would switch the graphics from onboard to the dedicated card and just use the video card's output. Needing both outputs connected to a monitor really defeats the point, since some monitors only have one input.
The reason you might be getting a blank screen is that the monitor picks up the first signal it gets and has no way of switching between its connections.
I am running a Gigabyte Z68X-UD3H-B3 board with LucidLogix Virtu GPU virtualization technology and an Intel Core i7 2600K Sandy Bridge quad with Intel built-in HD Graphics 3000. I kept my 8800GT connected, then plugged an HDMI cable into the Z68 board and into the other DVI port on the monitor, so I have both connections hooked up on the PC. I'm not sure exactly how to run this setup properly, but it seems to be working. I've also noticed that sometimes when I boot up I get a blank or black screen, so I have to unplug the HDMI from the board, and then I get my picture back. I think I can switch back and forth from built-in to dedicated graphics, but I can't seem to tell if it's working or not. Would the setting in the BIOS be the VGA one, which is set to Auto?
Hi Captain Kirk, and welcome to the forum...
Your PC is not "converted to HD." Your video output is coming either from the 8800 or from the onboard chipset. As such, you can use both (one at a time) with a BIOS change at start-up. If you post more info regarding the motherboard specs (exact model), there is more to learn.
Welcome to TechIMO.
Either the DVI or HDMI connection will enable an HD (1920x1080) image on your monitor. It makes no difference.
DVI to HDMI on Gigabyte Z68?
If I plug a DVI-to-HDMI cable into the HDMI port on my Gigabyte Z68 mobo, then into the DVI port on my monitor, will this make my PC HD? Also, right now I have DVI connected directly from the monitor into my 8800GT. If this HDMI connection works and converts my PC to HD, what happens to my 8800GT card? Does it become obsolete, or can I switch configurations back and forth if I leave both hookups connected to my monitor?