If you invest in Socket 2011 and want a 3930k I can probably sell you one for like $400. I want to upgrade to a 4930k.
AMAZON SUPER SAVINGS:
http://www.amazon.com/ASUS-PQ321Q-31.../dp/B00DJ4BIKA
Quote:
List Price: $3,499.00
Price: $3,498.95 & FREE Shipping. Details
You Save: $0.05
Doesn't change the WANT factor, however.
You want a monitor that is actually just two monitors stitched together and requires you to use two HDMI 1.4 connections or one DisplayPort connection (which is then passed through a demultiplexer that splits the data into two streams)? Also something that basically only works on Nvidia cards with the latest drivers, because otherwise the display appears to have sync issues between the two halves?
LOL.
I'll wait for a real 4k monitor/TV.
^
Also, can monitors of the future have ultra-thin bezels plz? kthxbai
This is probably going to be what I'll be using as my dedicated living room Steam Box:
http://www.youtube.com/watch?feature...;v=YlW6hw0WUpw
I've already built a gaming PC with the Elite 120, so this one should be even more awesome since it's basically a replacement for the Elite 120.
I still like the Silverstone Fortress FT03-Mini myself. Has a sleeker shape for something I would want to put in my living room. If not that, a proper HTPC chassis would do, since I could install SLI GPUs and pretty much whatever other hardware I wanted without worrying about space constraints.
Does anyone know if you can use one graphics card with dual monitors and your integrated graphics with a third monitor simultaneously?
Most motherboards don't support simultaneous use of the integrated and discrete graphics cards like that. Plugging in a discrete GPU disables the integrated chip.
Most of the newer moderate to high-end GPUs now support three or four outputs simultaneously, though.
That depends on the motherboard, really. If you have something recent, the integrated GPU should be an option in the BIOS that you can set to Enabled rather than Auto. I used to do this for a third monitor when I was using a GTX 570.
Also, AMD R9 290X in a few hours? I hope you sold your Titans while they are still worth something, Freelancer~
Titans should be on par or better still.
http://www.anandtech.com/show/7457/t...r9-290x-review
good job AMD you made a nice thing. Try doing this with CPUs please!
Like I said, right on par with the Titan and months late to the table.
Although I could sell my two Titans and buy three R9s... But I couldn't bear to go back to AMD. I use a FirePro at work and the drivers (Catalyst Control Center, at least) are still terrible.
Titan is actually as old as the 680; Nvidia just used them all in Tesla compute GPUs until they felt it necessary to out it as a consumer GPU. They basically handed the last generation to AMD because they had a trump card in their back pocket that would force AMD into the catch-up role. The R9 290X is definitely late; well played, Nvidia. It would be silly even for me to upgrade my two 7970s to an R9 290X, let alone for Freelancer to swap out two perfectly good Titans for two 290Xs in Crossfire. The gain just isn't there. The 7970 in CF (or R9 280X, if you really want to call it that) is only 12 FPS shy of two R9 290Xs in CF in Battlefield 3 at 3840x2160. Worth $1100? Nope. Never. At lower resolutions, the extra computing horsepower is irrelevant.
basically what warsaw said. my current monitor can only do 1600x900 at max res so the new generation is kind of dumb for me to buy into until I can get a 4k monitor at reasonable prices. if I had waited a month before getting my 7970 I may have waited for the 290x but meh. hopefully amd's Mantle API turns out to be on par with CUDA. if nvidia stays in the lead and drops prices my next card is definitely an nvidia. that is unless ati keeps murdering nvidia in price/perf. even the titan at 700 is ridiculous imo.
...and if games stop being hopeless console ports, you could be waiting longer.
It's also worth keeping in mind that current PC games are ports designed for hardware from 2006 but that's not going to be the case for much longer. It could end up pushing affordable 4K gaming further away than expected.
I stand corrected.
Interesting... A card that will be able to utilize PCI-e 3.0... AMD has gotten me interested
Quote:
Originally Posted by Anandtech
Yeah, I don't expect it any time soon, but 3 years is a bit much, imo. When a reasonably priced 4k monitor comes out I'll pick one up; it shouldn't be too long (I'm thinking a max of a year or so before we see one in the sub-$800 range), which means the 2560x1440 or whatnot will eventually be in the $250-300 range. What I look forward to more than that is a monitor that's ~22" at 2560x1440, which is about the same ratio as the Retina displays on iPads. No point in buying a beast of a monitor if it's got worse pixel density, no matter how large it is.
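For anyone who wants to sanity-check pixel-density claims like this, the arithmetic is simple. A quick sketch of mine (the panel specs below are the commonly quoted ones for a 22" 1440p monitor and a 9.7" iPad Retina display):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical 22" 2560x1440 desktop panel vs. a 9.7" 2048x1536 iPad Retina panel:
print(round(ppi(2560, 1440, 22)))   # ~134 PPI
print(round(ppi(2048, 1536, 9.7)))  # ~264 PPI
```

Handy for comparing any two screens before paying a premium for "more pixels" that may come with lower density.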
It depends- 4k monitors have commercial viability outside of gaming as well, especially in the larger enthusiast HD-TV market.
That's not what the test data suggested. If you've only got one, sure, maybe. If you've got two, you're fine.
Quote:
Originally Posted by Btcc22
This is not quite an accurate conclusion to draw. Consoles in 2005/2006 were dead even with all but the most bleeding-edge PCs as far as performance went. These new consoles are barely middle-of-the-road compared with PCs, and anything with an FX-8000-series or Core i5 CPU is already beating the crap out of them with regard to basic computing horsepower, since they are equipped with 8-core AMD equivalents of the Intel Atom processor. The GPU inside each is not even as good as an HD 7870, which is a third-tier GPU, and this is before touching on the reduced clock speeds.
What does this mean? This means that all of those optimization techniques that have made PC games run like a dream today (thank you consoles) will still be applied and will result in PCs using the best we had in January remaining competitive in the future. These consoles can't do 2160p. They will barely be doing 1080p when they finally start looking next-gen. Me? I'm going to be sitting here comfortable with my 1440p monitor, still playing games at their highest settings in 4 years. This is why you never buy mid-tier: it's a shit investment and harder on the wallet than just going all-in to begin with.
Except if AMD holds to its promise with Mantle we may see games being handled at higher resolutions than before.
BF4 will be supporting Mantle and the 7xxx range AMD cards have support by default.
If AMD is true to its word about 9x the draw-call performance over DX11, then one could guess that games using Mantle may be more easily handled at higher resolutions even if their fidelity is on par with DX11.
AMD isn't in a position to really provide that figure. Drawing passes vary per engine, so the most they could do would be to provide a time in milliseconds to perform specific tasks.
The 9xDP figure came from DICE, so that's probably just scratching the surface against what it can do.
I shudder to think what some true optimization in rendering could do with Mantle (cue Crytek).
But the potential isn't there, and that is really what counts. They can squeeze it, but the improvement to PC optimization is at about 90% (eyeballed) of whatever the optimization they get out of consoles, so it's basically lock-step. Anybody with an HD 7870 or better and an i5-class CPU will be just as well off playing games at the same quality settings as the new consoles, probably better.
All I was trying to say is that the new generation of consoles will most likely push 4K gaming further into the future than many expect, especially given that many haven't even considered the point. Whether it's only by a year or several years, the point stands.
It's still a good step-up in hardware over the last generation and running games that have been optimised specifically for the console hardware, even if you believe most optimisations have already been taken advantage of, at several times the resolution is still going to take a beefy rig.
You could also argue that with the change in architecture, we could see more games being able to 'scale' in order to take advantage of the extra power in PCs (or inversely, downscaling, to make them run on consoles), making it all the harder to run them at higher resolutions with everything cranked up.
On another point, at what stage do we decide that there's no need to go to aim for a higher resolution with PC monitors and gaming? At what point is it better for the extra power to be put towards making visuals better rather than simply running it at a higher resolution?
Anyway, wake me up when we can have 120hz 4K monitors.
Honestly, pure resolution is meaningless to me. 4K is a gimmick, and not a particularly useful one because it's just another 16:9 ratio. I'll be impressed when we get TVs and desktop monitors with pixel densities in excess of 300ppi that, as a bonus, are in some ratio greater than 21:9 with a curvature. At that point, I think we'll be able to focus less on pushing more pixels and more on improving the details in the content being rendered.
Now I'm curious to see how G-SYNC impacts this 290X = Titan at half the price debacle.
Gonna be funny to see a 780/Titan running a game at 60 FPS look smoother than Crossfire 290Xs running it at 100+...
I still find it funny that it took until now for someone to even patent the idea of clocking monitors to fit the frame rate.
ALL RIGHT, well, I've done most of my computer, fooled around with six-plus 250GB HDDs trying to get them all in RAID, reinstalled Windows, had tons of dumb errors, and I am now on the final stretch to victory with my PC!
Total Specs up to this point: (prices are approximate, as I haven't remembered them all. If you ask, I can look them up fairly easily)
AMD FX-8150 8-core 3.6 GHz (<free> THANKS RAMBO, you're AWESOME!) (Windows Experience Index: 7.6)
ASRock 990FX Extreme9 (if I'd done some more searching, PCI-e 3.0 would have been nice, but w/e. this'll last me until I can pay off the car)
16 GB of RAM (7.6 on the index) ($300)
ATi Radeon 7970 3GB GDDR5 (7.9) ($300)
Case: Rosewill Thor ($230)
Corsair 850W Gold Semi-modular PSU ($250)
Samsung Evo 126 GB SSD (~$100)
8200 DPI mouse (no Linux support :ugh:) ($30)
Salvaged laptop screen + controller from eBay ($50). This screen has a max res of 1600x900, which my laptop hardware actually couldn't reach; it only went to like 1440xSomething, so that was a neat surprise.
Laptop screen stand from cardboard cutout box: $0
Sennheiser HD595 headphones ~$120 (Skyrim and Doom BFG... super expensive, it was like 90 bucks altogether. :ugh:)
This hobby's kind of out of hand... I need to stop buying things. Yeah, God would probably want me to stop. Why did I even mention God? idk.
Attachment 3262Attachment 3263
The next step, in all honesty, is a 2560x1440 display, which I am NOT purchasing unless I can get it for sub-$200. I'll get one, believe me.
1440 x 900, that's the old 16:10 widescreen resolution that came standard on 15" laptops.
You should buy another 7970 to complement that first one. :)
I'm liking the monitor stand :)
Also, yeah, get a second 7970 and turn that room into an oven~
Finally got around today to organizing my room so my computer setup is across from my TV, where my 5.1 surround sound is set up. Now I can finally hook my computer to it and use the speakers for sound :)
What I get to look up at every day :D
Also, for those who don't know, here's my computer specs:
Monitors:
Two Samsung SyncMaster 2233's 22" for the sides
Samsung 24" LED for the middle. (couldn't find the old ones on the side, didn't want to buy two more of this monitor :( )
Computer Specs:
CPU: Intel i7 3.1ghz
Motherboard: ASUS Maximus Gene V
RAM: 8GB Vengeance
GPU: Asus ATI Radeon HD 7850 2gb
HDD: 1TB Western Digital and a 500gb Western Digital
PSU: 750w TX Corsair
I'm thinking about getting another 7850, maybe, so I can run Crossfire... Not totally sure about it, but I think it would be pretty cool. The case is kinda small, though, and that's another thing I'd like to upgrade, since I've had the case for almost 6 years and it's getting kinda beat up.
You can also see, in the corner of the first picture, my home server. I use this for game development and other stuff; got a sweet repository and Windows Server 2008 on it. Runs like a champ :) ! I didn't build it, it's an HP, and it's pretty sweet. 4 storage bays and it's really small, pretty fast as well. The "HP ProLiant MicroServer". Got a 1TB and another 250GB in it. I'd like to eventually upgrade from it, but for now, this works pretty well.
too bad no monitors below 27" have higher res than 1920.
also@ innergoat. I have a history of awesome computer stands:
http://fc04.deviantart.com/fs25/i/20...by_Dwood15.jpg
There was a time when you could get 24" monitors in 2048x1156 or some funky-ass resolution like that. Samsung made them, and they were awesome.
There was also once a time where you had monitors at 5:4.
16:10 is sadly the last bastion of PC-unique resolutions and the final hope for something other than a television-based experience.
Woe is life until 8k. Apparently we're all supposed to just stop caring then because pixel density is so crazy good.
Mac gets it point #1
2560-by-1600 IN A THIRTEEN-INCH LAPTOP.
2880x1800 display in a 15" LAPTOP.
See ebay: HERE
I'm tempted to grab one of those 2880x1800 for 150, and purchase a compatible controller board for $100. There's a guy who made his own... If we can find a 2880x1800 for ~$80 or less, it's an instant-buy, to be honest. Though, you can grab a 9" retina display for around the same price.
point #2
Edit:
I firmly believe the problem with monitors is that the prices do not scale appropriately. As higher resolutions, refresh rates, etc. come out, the lower-end models should drop in price by orders of magnitude. The problem is that people still buy the lower-end models, propping up their prices, so there's a marginal, if any, difference between the two. Thus, screen makers are not inclined to offer a good deal on their LCD screens except in the case of TVs which, by comparison, still suck.
My previous monitor was 5:4. If getting the Dell U3011 wasn't another $200 over the U2711, I would have gotten it. Alas, $200 for another 160 vertical lines was just not worth it. So, 16:9 for me it was. On the upside, it means I don't have letterboxing or distortion when using my Xbox 360 with it. :)
I am now going to cease any purchasing and save up for the day that monitor comes out.
Hi guys
in a vague attempt to not post something caked in sarcasm in an otherwise terribly depressing world... building a machine which requires a bit of mobility (possible future LAN parties, but also for my desk space) but should be capable of HD gaming with a bit of future-proofing... (no fucks given about 4k monitors currently)
any thoughts or is the below okay?
Power Supply: Corsair CX750 750W
Processor: Intel Core i7 4770K 3.5GHz
Processor Cooler: Corsair H80i Water Cooling System
RAM: 16GB Corsair Vengeance DDR3 1600MHz/PC12800
Motherboard: Gigabyte Z87N-WIFI Mini ITX
Graphics card: Nvidia GeForce GTX 780Ti 3GB
Hard Drive: Corsair 120GB Force LS Solid State Drive
2nd Hard Drive: Seagate Barracuda 1TB Hard Drive 7200rpm 32MB Cache
will be housed in this.. [/no mac pro inspiration]
Attachment 3273
I do feel like i'm limited by wanting it to be in a smaller form factor but it should be a fun build. I also am actually for once stumped on the os to install on it too. I would like to drop win7 ultimate on it but I know the whole optimised for 8.1 thing is pretty common too..
You won't suffer from having Windows 7 Ultimate on it for several years, and whatever comes after 8 has to be good because of the Microsoft Cycle (TM), right?
Otherwise, awesome build. Don't like the SSD, would prefer Intel, Samsung, SanDisk, or Crucial, but otherwise no gripes.
Go win7. Win8 isn't actually windows, but Microsoft's side-OS Tiles rebranded because new things scare people.
How have SSDs been over the past couple of years? Last I paid attention to them Cherryville just came out and was curb stomping everything else. That still the case?
Oh don't worry, I'll be investing in a 1TB SSD from either Intel or Samsung; just not really ready to spend more than an Xbox One costs on storage space yet, when really all the SSD needs to do is load the OS, among other bits and pieces :)..
fake edit: plus i can't get away with being THAT selfish this close to christmas haha
real edit: thanks for the advice, i have pro/ultimate licenses for them, just personally not sold on either anymore.. perhaps steam os!!
:) <- link on a side note, that's cheaper than i was expecting, and the price per gig actually goes down compared to the past when price per gig would go up with each higher capacity.
You know, it might have been that. At the time, I simply yanked the drive and re-installed everything onto my 10k because I needed my computer up and running. I never bothered to look into it after that. The symptoms sound right.
If it was that, you could have still updated the firmware and got it working after it crapped out. That's the only widespread issue that affected the Crucial M4s that I ever saw.
@Rosco, the Prodigy cases are definitely made for LAN and mobility use... However, lots of people complain that the feet/handles aren't sturdy enough. The case will wobble when it's just sitting there, and I don't really trust the top handles to carry a fully loaded case.
At the time, I wasn't aware there was a bug. I still have the drive, it's even still in the computer. I just need to reconnect it and update the firmware, that's all.
Interesting thing about the case... no, even if it was stated as super reliable I'd be more inclined to carry it like a normal PC, only because of the expensive stuff inside it haha. The reason I actually went for that case is more of an appealing-design thing; it's not very expensive, so if it doesn't work I'll replace it with something stronger if I have to :)
nothing a little reinforcing can't fix if the build quality isn't too good! :)
PC keeps crashing without showing the blue screen of death.
Is it more likely to be a hard drive problem or a GPU problem?
On a completely other note:
Dell prices its 24" 4k monitor at... $1,399 BANG.
It is essentially that. I can find screens that are 2560x1440 on the net for decent pricing; add a controller board and you have yourself a DIY high-res monitor for ~$450. (note: NOT 4K) The only place where there's any competition right now is the smartphone screen providers. If it were price per pixel, the 4k monitor I drooled over a few pages ago would be outrageous if the same price-per-pixel made its way into phone land. The past 2 years have been the industry orgasming over HDTVs, and it's only now that it's starting to fade. The size of the 23" monitor I bought on Black Friday fits me JUST fine, despite desperately wanting 4k res.
But also, that Dell is for artists and extreme enthusiasts like Freelancer, who spends $5,500 per computer. It supports Adobe RGB and sRGB color ranges, so that's also part of the reason for the increased price. (note that the lag is 8ms, which may create a ghosting effect while gaming due to the enhanced color precision)
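Since price per pixel came up, here's a rough way to put numbers on it. The $3,499 figure is the ASUS PQ321Q price from earlier in the thread; the $60 phone-panel price is a made-up number purely for scale:

```python
def dollars_per_megapixel(price_usd, width_px, height_px):
    """Cost per million pixels -- a crude way to compare panel pricing."""
    return price_usd / (width_px * height_px / 1_000_000)

# The $3,499 ASUS PQ321Q 4K monitor mentioned earlier in the thread:
print(round(dollars_per_megapixel(3499, 3840, 2160)))  # ~422

# A hypothetical $60 5" 1080p phone panel (made-up price, for scale only):
print(round(dollars_per_megapixel(60, 1920, 1080)))    # ~29
```

The gap is an order of magnitude, which is the point being made: desktop pixels cost far more than phone pixels right now.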
http://i69.photobucket.com/albums/i8...ps0350b9e2.jpg
Maybe a Power supply problem?
details tab
You know, my monitor is also supposedly 6 ms GTG with 12 ms typical, 16 ms tested by Anand. If it's ghosting, I'm usually too busy having fun or ogling the visuals to care. A bigger issue is the input lag that results when you enable VSYNC. In Battlefield, it's enough to get you killed.
You can get non-DIY 2560x1440 screens for less than that. For example, this.
nope.
It was a while back because I had a warped, fucked CPU heatsink and fan, but I now run a MASSIVE CPU fan, which keeps it at 35 degrees under load.
No more milk carton?
That thing was legendary.
That's what I'm talking about. I always equated the ms figure with the response time between updates, alongside hertz/refresh.
My old Samsung SyncMaster had an advertised 75 hertz refresh rate, but when we went to the full res of the monitor, 1900x1200, it would drop to 60. False advertisement, or misleading, rather.
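If that SyncMaster was a 1920x1200 panel on single-link DVI (my assumption, not something stated here), the drop to 60 Hz at native res is likely a link-bandwidth limit rather than false advertising: single-link DVI tops out at a 165 MHz pixel clock, and even the active pixels alone at 75 Hz blow past that:

```python
def pixel_rate_mhz(width_px, height_px, refresh_hz):
    """Active-pixel rate in MHz, ignoring blanking intervals,
    so this is a lower bound on the required pixel clock."""
    return width_px * height_px * refresh_hz / 1e6

DVI_SINGLE_LINK_MHZ = 165  # single-link DVI pixel-clock ceiling

print(pixel_rate_mhz(1920, 1200, 75))  # 172.8 -- already over the limit
print(pixel_rate_mhz(1920, 1200, 60))  # 138.24 -- fits within 165 MHz
```

Real timings add blanking on top (1920x1200@60 with reduced blanking is about 154 MHz), so 75 Hz is even further out of reach over a single link.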
What sucks about all of this is that 4k monitors in less-than-27" screens are only making a comeback JUST NOW. Those monitors are $1,200 on eBay currently, and were released in 2001. TWO THOUSAND AND ONE. WE HAVE BEEN ABLE TO HAVE 4K monitors for 12 FREAKING YEARS. I'M PISSED.
And what did you want to do with 4k Monitors in 2001? Watch a DVD? Play a game with your GeForce 8600 GT? ^^
I cannot understand why one would want one right now either. In the future, maybe, yes, but now?
There were no 8600 GTs in 2001. Radeon 9800 Pro and Nvidia FX 5900 Ultra were in vogue, though. :)
This has nothing to do with that. I'm pissed because we've moved backwards, not forwards, in monitor screen-tech. It's akin to going from an octo-core cpu back to dual core, or from 28nm architecture to 56 nm or larger.
See above.
Or going from having a multi-window workspace to being limited to two windows...
Sorry idgi. If I am completely missing the point, would you please elaborate?
Where have we gone backwards, or do you mean we have not gone as fast forward in terms of screen resolution? 3nm transistors already existed back in 2006, so being at 28nm technology right now does not mean we have gone backwards... there is a lot more to a product being commercially available than just "having the technology to do it".
Foremost is the demand, and for me, for instance, 4k is not really interesting right now because it does not seem like a leap forward in technology, just another "nice-to-have". My demands for a new monitor (I have been looking for one) were just 1200p (because I prefer 16:10) and, more importantly, being pre-calibrated with 100% sRGB coverage. I have a dual-monitor setup because I often work with 2 windows simultaneously. If 4k allowed me to do this conveniently on one screen I would welcome it and consider using it; I am just not sure it does.
Also, just to give you an example from another tech branch: do you know how many megapixels the top-of-the-line $5,500 DSLR from Nikon has? 16. They have cheaper DSLRs with up to 36, but they decided to use 16 on their best camera. There is always more to something than it might appear at first, and in some cases, less is just more.
I'm upset because I honestly think the average standard for a computer monitor sold at a place like Best Buy should be a minimum of 1440p. But no, there are still 1080 models. 1900x1440 is the high-end that's on display, and TVs utterly demolish computer screens in display area.
Sure, it's great for you; the IBM T220 and T221 were commercially available, in mass production, in 2001. More than that, the reasons why we've gone backwards, or stagnated, are dumber than I'd like. There was a time when computer monitors murdered TVs in terms of quality (and a lot of the time, they still do), but since the HDTV hype has subsided a bit, we've not only come down from the T220/T221, we've stayed at the same exact ratios as TVs ever since 'HD' entered as a television standard.
2560x1440 is 1440p.
Where we went backwards on PC displays is moving from 16:10 to 16:9 and from having the common 14" and 15" display resolution go from 1440x900 to 1366x768. It's atrocious and, while I'm glad to see this resolution is fading away, the market is still over-saturated with garbage computers so equipped. Why? All because panel vendors could simply scale production of their 720p panels to fit the PC market, the cheap fuckers.
1600x900 is also a step down from those 1680x1050 16:10 monitors that used to hold their position on the market, but since 1080p monitors have become so inexpensive I don't mind as much.
Also, Dell's replacements for the U2711/U3011 are shit. I got a hell of a lot more for $900 with my U2711 than you get for the same price out of the U2713. Way more connectivity, CFL backlighting be damned.
e: warsaw posting too fast. this is @dwood
Well, you should probably not compare apples with oranges; for obvious reasons TVs have a bigger display area, and we are stuck at these aspect ratios because they are the golden ratio, after all. You should also look more at the back end of things: the content which will be displayed is surely as important as the display itself. I know in America they are further ahead with HDTV, but over here we just barely switched over to it, and the picture is still just 720p which then gets upscaled to 1080p, so why should people buy 4k TVs now if there is no standard for it? No console, no Blu-ray and no TV channel can output that resolution.
To me, display development has gone in the wrong direction. Everyone just wanted a smaller response time (which is really just gamer elitism...), a higher resolution and a bigger screen; the usual "bigger is better" mentality. Why not focus on stuff we would all benefit from, for instance pre-calibration, so we can all (this sounds very banal) at least see the same picture, which, after all, is the point of a display? Why own a huge car with 500bhp if it cannot get you from A to B?
I welcome every new form of technology, but if 1440p or 4k means down-scaling my games so my gfx card can handle them, upscaling movies to fit the resolution, and Windows not being practical at doing multi-window work on one display, I just don't see the need for it, and I'll wait for the point in time when there is technology to truly support it.
@warsaw 16:10 supremacy.
On a side note, maybe I am just not experiencing these controversies. My Computer has 1200p, my Laptop has 1080p, TV has 1080p and my smartphone has 720p. Pretty much all standard.
I have no life...actually I do, but I'm ignoring it at the moment.
The size-ratio and resolution issues have impacted laptops more than anything else, since the displays are integrated. Sure, you could always get 1920x1200 or 1920x1080, but these are generally regarded as premium options, and were so even on laptops 17" and up until fairly recently. Even then, you still paid for it. It's the mainstream end of the spectrum that really got the shaft. My laptop from 2005 has a 14", 1440x900 16:10 screen, standard. Even today, a 14" or 15" laptop comes with a 1366x768 16:9 screen as standard. This is unacceptable, and it's not like consumers had a choice or a chance to influence company decisions (I love broken economics) because the entire industry went that way except Apple, but Apple's 13" screen was 1280x800, so it wasn't really any better.
Quote:
On a side note, maybe I am just not experiencing these controversies. My Computer has 1200p, my Laptop has 1080p, TV has 1080p and my smartphone has 720p. Pretty much all standard.
This is of course true, but one should not make things appear that simple, since there are a lot of factors that go into the equation.
This is also a reason I don't like considering Moore's "law" a law; it is just an observation over a short time span. What should be a law is that technology will stop when it meets human limits. In the case of resolution: when we reach the point where our eyes cannot differentiate between resolutions anymore. When hard drive space gets so big we cannot put enough data on it because our lives are too short to view all of it. When CPUs are so fast that human reaction is the bottleneck. When the DPI of your mouse is so high that the movement of a muscle is always too rough for precision.
And I think with resolution we will soon have hit the limit on an average computer screen (24"? also a human limitation, obviously); there is more potential for pixel density right now, but I doubt the difference between 4k and 8k will have any significance. To be able to use more screen space you would have to make everything smaller until you cannot read text anymore (another human limitation). On a TV this is obviously another story because of the screen size / pixel density.
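The "eyes can't tell the difference" point can actually be put in numbers. Using the common 20/20-vision rule of thumb (one pixel per arcminute; a simplification, not hard science), you can estimate the distance beyond which individual pixels become unresolvable:

```python
import math

ARCMIN = math.radians(1 / 60)  # 20/20 acuity rule of thumb: 1 arcminute per pixel

def acuity_distance_in(width_px, height_px, diagonal_in):
    """Viewing distance (inches) beyond which a 20/20 eye can no longer
    resolve individual pixels on this panel."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    pixel_size_in = 1 / ppi
    return pixel_size_in / math.tan(ARCMIN)

print(round(acuity_distance_in(3840, 2160, 24), 1))  # ~18.7 -- 24" 4K
print(round(acuity_distance_in(7680, 4320, 24), 1))  # ~9.4  -- 24" 8K
```

So on a 24" panel, 4K is already at the acuity limit from about 19 inches away, and you would have to sit under 10 inches from the screen to gain anything from 8K.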
This is a topic where we have a lot of knowledge / daily practice, which is why it is hard to debate, but if you take a look at other tech branches where there are similar discussions (which are far worse, tbh; this here is pretty factual) you will find these human limitations much faster. Can you see the difference between a 16MP or 36MP picture printed out? Can you tell if the picture was taken with a 16-28 Nikon 2.8 ($2,800) or an 18-105mm 3.5 ($150) lens? Can you hear if an electric guitar was recorded on a solid-state amp or a valve amp? To you these seem like small differences, but people make a HUGE deal of them (and pay a lot of $$$). Would you pay 10 times the money for a guitar amplifier when you can barely hear the difference? I did, just because of the "bigger is better" mentality :P
I just brought these examples up because I think these topics compare very well to computer technology.
Hmm, well, it is difficult to say where the diversity comes from; maybe it is just a time when companies compete with form factors until it balances out and a standard is established? Or it is the gfx card, which is not very powerful in netbooks, or it is like I said: you don't see any difference, or the screen space would be too big / the items too small. Do people even care? Would you care if you didn't know the individual screen resolutions? Progress comes with the need for improvement. Laptops are either too heavy, too slow, too large, or the battery life is too short, but I never heard people saying "the screen resolution could be a lot higher".
There are also people like my dad, who does not have the best eyesight and turns down the resolution to be able to see text on screen.
I would notice not being able to have as many windows up, or having to scroll more. My sister noticed these things when she went from her 14" with the 1440x900 resolution to a Lenovo U410 with the garbage resolution, and she's not a tech-savvy individual like we are. Most people don't really know about pixels, but they do think a larger screen means you should have more workspace. Unfortunately for them, it doesn't work that way.
It's a shame DPI-scaling isn't very well implemented, because your dad having to turn down the resolution to read the text is a sad state of affairs for computing.
Being able to change the DPI while keeping the UI size constant at a given resolution (and being able to change that as well, of course) would be a brilliant idea!
It certainly is. Those are the kinds of features we need in computer displays, so our UI can actually scale with increasing pixel density. It sounds like the concept of vectors. Though, the OS or graphics drivers would need to know either the dimensions of the display or the pixel density as well. That way, sizes could theoretically stay the same but the overall details could be more crisp, too.
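A minimal sketch of what that kind of DPI-aware scaling could look like. The function name is mine, and the 96 DPI baseline is the traditional Windows "100%" setting; the idea is just that the scale factor grows with physical pixel density so UI elements keep their physical size:

```python
import math

REFERENCE_DPI = 96  # the traditional "100%" desktop baseline

def ui_scale(width_px, height_px, diagonal_in):
    """UI scale factor that keeps on-screen element size roughly constant
    as pixel density grows (needs the panel's physical size, as noted above)."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    return ppi / REFERENCE_DPI

# A 24" 1080p panel sits near 1x; a 24" 4K panel of the same physical size
# wants roughly 2x, so text stays the same size but gets much sharper.
print(round(ui_scale(1920, 1080, 24), 2))  # ~0.96
print(round(ui_scale(3840, 2160, 24), 2))  # ~1.91
```

This is exactly why the OS needs the display's physical dimensions (e.g. from EDID), not just its resolution: the same pixel count on a different panel size demands a different scale factor.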
Let's look around and see if there are any solutions to that question/problem.
Pfff, that's nothing a few questions at set-up can't fix. And a vector-based UI would be perfect.
Watch this and feel inadequate!
http://www.youtube.com/watch?feature=player_embedded&v=ihh3yKnnPO0
As someone with a $5,000 system.
Challenge accepted.
I haven't had TIME to sit down and figure out what custom water-cooling stuff I'm going to put in my system yet... I know it's probably going to total about $800-$1,000 based on other builds I've seen. The reason I got this case was for the water-cooling upgrade potential it has; it would be a shame to never put that to use!
I'm probably going to pick up another used Titan in the future as well, because I already have the triple-SLI bridge and the board can do PCI-E 3.0 x8 on 4 lanes just fine, so I have the bandwidth... But not until I need it, because I'm running BF4 on Ultra at 5760x1200 without any issues, and every other game on max settings at the highest resolutions they can do, and getting a solid 60 FPS with adaptive VSYNC.
You need more screens, just in case you decide to turn your head
I'd recommend not bothering to wait, but it's difficult to predict the future.
RAM clock speed barely affects performance. It will take some time until the price/performance for DDR4 is right. Right now you are just better off spending your money on DDR3.
Just get yourself a good GFX Card with CUDA cores and you can enjoy hardware acceleration in Adobe products.
Okay, then. I upgraded from a laptop with a dual-core processor and integrated graphics that could only play Halo PC on low settings. It's not that desperate, I hope?
The motherboard is really the key to any computer's upgradeability. If you don't mind waiting, what you CAN do in the meantime is purchase a nice gfx card and a super-nice PSU, then save up some more, buy a nice new case, wait/save up a bit more, etc. There are no motherboard announcements that I have seen for DDR4, so your computer ought to be fine with DDR3 for the next year at least. Make sure, for upgradeability, you buy the NICEST mobo you can, as well as the NICEST PSU you can, and go from there. You can port over old HDDs, CD drives, even the case, until you're ready for a new one.
If you were buying a new computer and there were at least announcements for high-end motherboards with DDR4, I would say wait, but Crucial is the only producer that's announced rolling out any DDR4 chips. No point.
If you're going to be building a new PC within a year, then you'll definitely want to wait and see what DDR4 pricing is like half a year from now. If this is any indication, you definitely want DDR4. It's just too much of a step up from DDR3 if it is affordable.
OpenCL on AMD will do well, too: http://www.tomshardware.com/reviews/...o,3208-13.html
CUDA != OpenCL. Both ATI and Nvidia use OpenCL but some things are CUDA accelerated. Not that it makes a huge difference because only some 3rd party plugins use it but if you don't care and the price is right you might as well go with CUDA.
Some native plugins in Premiere use CUDA which are marked.
On the flip side, if you plan to mine any cryptocurrencies stay far away from Nvidia.
Depends on the currency. LTC is doable. BTC? Ha, no.
I need an uninterrupted power supply after all these power outages in the GTA. Someone recommend me a good, basic one for less than $100 CAD.
Sorry, got nothing to recommend... but, in other news:
Dell planning on a 28" ultra-high-definition for sub-$1000:
Look here about halfway down.
Quote:
Coming Soon: Dell 28 Ultra HD Monitor – Expected to be The Industry’s Most Affordable Ultra HD Monitor
The Dell 28 Ultra HD Monitor will be available in early 2014. Offering the same incredible Ultra HD screen performance as the Dell UltraSharp 32 and Dell UltraSharp 24 Ultra HD Monitors, but priced at under $1,000, this 28-inch monitor can help boost user productivity with its multiple adjustability features, including the ability to pivot to portrait mode, plus multi-task applications. The energy efficient monitor has multiple input ports that allow users to display content from smartphones and tablets on the larger screen, and conveniently connect laptops, PCs and essential accessories. Dell expects this monitor will be the most affordable Ultra HD monitor in the industry when it is launched.
!!!!
I'm still waiting for this curved prototype they showed us to go into production. I would buy the shit out of this: