Saturday, April 12, 2008

the GPU is the new CPU

there seems to be a lot of smack talk with the nVidia versus Intel shenanigans in the feeds lately.

Nvidia CEO "Not Afraid" of CPU-GPU Hybrids

CE-Oh no he didn't! Part LV: NVIDIA boss says "We're going to open a can of whoop-ass" on Intel


NVIDIA continues to hate on Intel, promises sub-$45 integrated chipset


basically, the VIA Isaiah + nVidia GeForce(5?) combo is pitted against the Intel Celeron + G945. and really, the performance they're talking about is akin to my solar powered calculator.

however, i do like the idea of putting a full-fledged GPU on the motherboard (and not just an IGP).

i predict that motherboard manufacturers will one day make mobos with GPU-specific chipsets, just like how they are currently CPU-centric. zif sockets for the GPU on the motherboard are an evolution that needs to happen and is long overdue. the dedicated memory chips on a speedy bus found on all modern graphics cards will probably prevent this from happening. but hopefully, as the xbox consoles have demonstrated, a unified memory architecture will satisfy that limitation / requirement.

to offset the loss of the dedicated RAM + speedy bus, programmers will need to learn to make their code go "one-way" in a "fire and forget" style. set top boxes, portable media players, the playstation 1, 2 & 3, etc., all perform very well when you push data through one end of the hardware accelerated "blackbox" and just let it do its thing, spitting stuff out the other side (normally the end destination as well).
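to make that "one-way" idea concrete, here's a toy sketch (in Python, purely illustrative — the queue and worker thread are stand-ins i made up for the GPU's command stream, not any real graphics API): the "CPU" side only submits work and never reads results back; the "blackbox" drains the commands and spits frames straight out the other side.

```python
import queue
import threading

# one-way command stream: the submitting side never waits on,
# or reads back from, the worker's results
command_queue = queue.Queue()
framebuffer = []  # stands in for the display the "blackbox" paints onto

def gpu_worker():
    # the "GPU": pulls commands off the queue and processes them
    while True:
        cmd = command_queue.get()
        if cmd is None:  # shutdown sentinel
            break
        # "paint" the frame: output goes straight to the destination,
        # never back to the submitter
        framebuffer.append(f"frame<{cmd}>")

worker = threading.Thread(target=gpu_worker)
worker.start()

# the "CPU" side: fire and forget -- push draw commands and move on
for triangle in ["tri0", "tri1", "tri2"]:
    command_queue.put(triangle)

command_queue.put(None)  # done submitting
worker.join()
print(framebuffer)  # -> ['frame<tri0>', 'frame<tri1>', 'frame<tri2>']
```

the point of the sketch: once you stop expecting data to come back, the bus between the two sides only has to be fast in one direction, which is exactly what a unified memory setup can live with.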


but, what's most likely going to happen is:
the CPU will gobble up the GPU

in terms of raw number crunching, the CPU will become less important in pushing the performance bar, while the GPU will dominate the clock cycles for painting and positioning pretty pixels. and i'm not talking about office or other desktop apps, but a graphics / multimedia combo system.

in light of all of this, i don't know why AMD/ATI hasn't made a run for a mobo with separate sockets for their CPU and GPU. it would have been quicker for them to push this out into the market than to finish their CPU/GPU combo first. and who knows how much that's going to cost.

just as the CPU gobbled up the once separate math co-processor, it shouldn't be a surprise to see the GPU and CPU collide. hopefully, the GPU will win this time, 'cause the plain box PC isn't exciting anymore. the cool stuff is all of the eye candy you see now-a-days in stuff like the latest OSX and compiz goodness. what? vista?!? you're insane...

i
the original human powered GPU
who bitch-slapped the CPU
into submission.

Tuesday, April 1, 2008

i HATE the RSS feeds on April 1st

oh holy jeebus, there was twice the number of RSS entries today and damn nearly all of them were filled with "haha, fools day" -- i felt like taking a $#!t on my reader.

the only saving grace was the bazillion Mr.T cameos from Gizmodo's posts.


i couldn't tell if the video card releases from both nVidia and ATI were real because i was p33ing on the screen.

all of the autoblog entries were useless: "cars now eco-friendly, made from TP, and waterproof." no, really, how about this: "Double-decker smart fortwo a little top-heavy":



be sure to watch it all the way to the end.

you can never tell which of the MAKE:Magazine projects are fake -- because, i mean, have you seen some of the projects they feature? none of them look like they would work... you know, ever since the first issue came out.

even the podcasts i listen to and video shows i watch are all in this f$@kfest.

crikey, i would rather they take a day off than have me scan through the piles of @$$ while scraping my eyes out with chopsticks...

BTW, i still love Ars Technica, Autoblog, Gizmodo, Make:Magazine and Systm