/g/ - Technology


Thread archived.
You cannot reply anymore.



File: cell.png (532 KB, 793x740)
is this true?
>>
so powerful it got replaced
>>
No it's not.

The cell CPU could do some very VERY specific types of math VERY efficiently, but it's retarded to think you can manage to program your game to utilize JUST those specific types of math.

The bottlenecks elsewhere made sure it was never going to actually outperform desktop class CPUs, even of its own era, let alone today.
>>
>>74769252
then what if it didn't have those bottlenecks and was allowed to breathe?
>>
>>74769234
The only problem with the cell processor is that a core can't communicate with another one.
>>
>>74769267
Then it wouldn't be the same CPU at all.

Those bottlenecks are hardware, you'd need to make the chip physically larger to get rid of the bottlenecks, more registers, more cache, ALUs, etc.
>>
>>74769234
lots of shit is stronger than intel's new CPUs, doesn't mean it's worth a damn in practical usage.
>>
File: 1568600052700.jpg (799 KB, 1954x2283)
>>74769234
2600k smokes that shit, lmoa
>>
File: 1568614606507.png (1.01 MB, 1154x688)
Cell was shit.
It was shit then.
It's even more shit now.
>>
>>74769272
Cell 2.0 on 5nm node with infinity fabric when?
>>
>>74769234
No. Of course not.
But it makes great headlines because the gaymurr crowd loves it and all the mindless Johnny Youtubes love to copy it.
>>
>>74769272
Just like Ryzen
>>
>>74769234
>guerrilla games
Imagine being such a retard you BTFO yourself by swallowing Sony bullshit hook, line, and sinker and keep repeating it for 15 years, each year becoming more and more of a joke.
I haven't been to /v/ in 10 years, but even when I left they were a laughing stop for repeating Krazy Ken's bullshit.
>>
>>74769332
*stock
>>
>>74769234
Pull up a chair, crack open your Monster Energy can, and inform yourself.

https://www.youtube.com/watch?v=zW3XawAsaeU
>>
>>74769297
Call me when you get TLoU working on a 360 anon
>>
>>74769234
Too much of a brainlet to say if it is true or not, but it stands to reason that if a PS3 can be emulated full speed on a modern Intel CPU, then the cell has to be the weaker CPU
>>
The Cell was hyper efficient at certain tasks that utilized the SPUs properly. If the task relied on the main PPE then it would perform worse than an Xbox 360.
>>
The Pentium D, the one where Intlel glued two Pentium 4s together with some shitty interconnect destroyed the 360's triple-core Xenon, which in turn destroyed the PS3's meme CPU over its entire lifespan.
>>
File: 1563253546150.jpg (127 KB, 1238x713)
>>
>>74769428
kek
>>
>>74769286
>>74769294
>>74769297
>>74769305
>>74769332
cope
>>
>>74769401
no it didn't
>>
>>74769332
I would imagine they have a bunch of demoscene guys considering they're in europe so it's not hard to believe they could make cell shit all over intel cpus
>>
>>74769234
Half true, it's like saying Intel's CPUs aren't as fast as a GPU for parallel computing tasks.
Of course, the juicy headline is everything though.
>>
Assuming whatever software you were trying to write could be efficiently coded for Cell, it would take many times longer to write it for Cell than to write it efficiently for any other chip.

It can actually get to the point where so many years have passed that by the time your software is functionally complete, there are better CPUs available.
>>
>>74769234
Very untrue, since even a mid-tier CPU (4c 8t) can emulate the PS3 now.
>>
>>74769234
Why did the Nips fall for Cell? PS3 was shackled more by the tiny RAM and quickly obsoleted GPU
>>
>>74769930
nvidia fucked them over like they did M$ and nobody worked with them after that
>>
He's just outright wrong.

https://www.sisoftware.co.uk/2018/11/21/intel-core-i9-9900k-cofeelake-r-review-benchmarks-cpu-8-core-16-thread-performance/

The 9900k exceeds 230 gflops in practice, which was the Cell's theoretical limit.

>>74769844
Yea, cell became pointless a few months after its release thanks to Nvidia inventing GPGPU. What an utterly pointless article.
>>
>>74769234
Ah the cell cpu, the biggest lie and propaganda tool of all time, it's 2020 and yet the myths about the "potential" never stop.
>>
youtube.com/watch?v=Z6QYN4CqQXM
2stronk can't handle it!
>>
>>74769234
guerrilla is a bunch of SJW faggots and trannies

PS3 is inferior to Xbox 360 as the multiplats show
>>
>>74769951
No, you're talking Trumpstyle.
IBM was and perhaps still is selling the same architecture hardware to enterprise and scientific customers. Even today, it's far superior to GPGPU as presented by NVIDIA in many but not all aspects and this fact is never going to change.
The main problem with the Cell architecture was simply IBM's and Sony's failure at delivering tools for developers, needed for properly making good use of the hardware.
>>
>>74769401
>being this retarded
pentium D can't run a single game of the 360/Ps3 era
>>
>by using some arbitrary metric that literally nobody uses which includes the feelings of the manufacturer
>>
>>74770063
a celeron could run oblivion better than ps3
>>
>>74769385
The cell PPE was simply a dual-threaded IBM Power architecture core, connected to the SPEs via the EIB (XDR was the memory interface). The PPE instruction set was *identical* to the one found in the xbox360.
There are even books in existence about how Microsoft profited from Sony's investments by hiring IBM for the x360 R&D.
>>
>>74770077

No it couldn't, hell a full P4 ran the game like shit. You needed a K8 to run the game proper in 2006.
>>
>>74770062
>IBM was and perhaps still is selling the same architecture hardware to enterprise and scientific customers
It's almost all enterprise. Their primary product is enterprise systems and storage solutions. And it only brings in like $7-8 billion

>same architecture
No, but since SIMD vector extensions are common now, all CPUs have similar hardware at a very high level. That doesn't change the fact that in terms of throughput, GPUs are closing in on 2 orders of magnitude more powerful than any CPU IBM or Intel have to offer. It's not in the same realm for parallel processing
>>
>>74770129
You are fucking retarded. Oblivion was purely GPU bound. Literally any CPU from that era ran it perfectly.
>>
>>74769484
What are you talking about, it got more exclusives than 360, so unless you were a hardcore Halo fan and didn't care about the hardware difference PS3 was better with the free online.
>>
>>74769234
Based. POWER still has all the Power.
https://www.raptorcs.com/
>>
>>74770375
How does it compare to 9900k and ryzen 3900x?
>>
>>74770202
>2 order magnitude more powerful than any CPU IBM or Intel have to offer
intel is developing oneAPI and part of it is DPC++; they realize that multithreaded programming has to be the default, otherwise people won't use it.
C++ is picking up more and more parallel algorithms in the standard lib, and hybrid or green threads are all the hype thanks to functional programming and go; I'd say we should expect massive efficiency improvements in the software stack this decade.
gpgpu is mature enough now and we're starting to backport things now that 8+ core CPUs are everywhere (phone, laptop, desktop, console)
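The "parallel by default" point above can be sketched with nothing but a standard library thread pool. A minimal Python illustration (the post is about C++/DPC++, so this is an analogy, not the API being discussed; the name `parallel_sum_squares` is made up for the example):

```python
from concurrent.futures import ThreadPoolExecutor

def sum_squares_chunk(chunk):
    # stands in for any data-parallel kernel
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=4):
    # split the input into roughly equal chunks, one per worker
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_squares_chunk, chunks))

print(parallel_sum_squares(list(range(1000))))  # -> 332833500, same as the serial sum
```

Same answer as a serial loop, just fanned out across workers; a real FLOP-heavy kernel would use native code or numpy, since pure-Python threads share the GIL.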
>>
>>74770410
Yet there still is a need for ASICs and FPGAs. I know it's apples versus oranges, but the architecture would have been unbeatable in some regards had they chosen to continue development; they did not. Nevertheless, most people agree that it was a marvel of engineering at its time; one sometimes still being a source of inspiration.
>>
>>74769428
macintosh is the only accurate one
>>
>>74769267
Then instead of a PSgrill you'd have a PSoven.
>>
>>74769234
No, it was a horrible architecture. The Cell actually has less performance than even the 360 for normal processing, the 7 "cores" that everyone memed back then are just shitty extra compute units that are very hard to program for
>>
>>74769234
https://www.youtube.com/watch?v=zW3XawAsaeU
>>
>>74770062
IBM isn't selling anything. They've been outpaced by everyone else and aren't even trying to peddle their shit hardware anymore. Their company survives purely off the licensing and support of their past customer base. If you think anyone bought hardware from IBM in the past 5 or 10 years for its performance then you're delusional
>>
>>74770845
>They've been outpaced by everyone
Their Power Systems outpaced EPYC Rome in IO before Rome even came out
>>
>>74769234
>of course it is goy! and we're going to sell it to you for a quarter of the price because we like you so much!
>>
>>74770531
>most people agree that it was a marvel of engineering at it's time
The ageia PPU came first with a similar concept, and it actually probably directly influenced the evolution of GPGPU when Nvidia bought Ageia
>>
>>74770375
Wish it was cheaper. I'd buy one in a heartbeat.
>>
not even a bit
RPCS3 exists for a reason
>>
>>74769322
kill yourself intcel
>>
>>74769352
>MVG
good taste
>>
>>74770077
why bother lying
i don't understand liars
>>
File: sigh.jpg (80 KB, 800x450)
>>74770355
>doesn't recognize PS3 HAS NO GAMES
how fucking new are you? lurk a little bit longer before posting, holy fuck
>>
>>74770375
>even IBM has Pcie 4.0
>Intel still doesn't
>>
>>74769252
What specific math?
>>
File: 1b77pc4c2cl21[1].jpg (61 KB, 812x1024)
>>74769358
>console exclusive proves hardware is better
>>
>>74769234
cell was a piece of shit
>>
I'm sorry but does anyone actually believe a 12 year old CPU that no one, not even IBM, supports anymore is better than a contemporary CPU?
>>
>>74771662
Floating-point arithmetic, though effectively only single precision (the original SPEs ran double precision at a small fraction of the single-precision rate).
>An SPE can operate on sixteen 8-bit integers, eight 16-bit integers, four 32-bit integers, or four single-precision floating-point numbers in a single clock cycle, as well as a memory operation

In contrast, a modern x86 desktop CPU can work on multiple separate instructions per clock, not to mention branch prediction and double precision FP. As well as having much higher overall IPC in general at this point in time.
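A quick sanity check of the lane widths quoted above — they all fall out of one 128-bit SPE vector register (my arithmetic, not from the post):

```python
REG_BITS = 128  # width of one SPE vector register

# element width in bits -> lanes per register, matching the quoted line
for width, expected_lanes in [(8, 16), (16, 8), (32, 4)]:
    assert REG_BITS // width == expected_lanes
    print(f"{width:2d}-bit elements: {REG_BITS // width} lanes per register")

# four single-precision floats (4 x 32 bits) also fill exactly one register
assert 4 * 32 == REG_BITS
```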
>>
>>74771839
so it wasn't good for anything then
>>
>>74771845

It was very good for parallel processing for a few years.
>>
>>74771766
A 12 year old GPU can be better on some measurements than a CPU from today. The Cell is some weird middle ground so I wouldn't be surprised if it competed with new hardware on raw throughput in some instruction. History definitely shows it was a bad idea though. CPUs for narrow work and GPUs for wide work is the most useful configuration.
>>
>>74771845
i mean, it was very good at specific things, but yeah overall it was nothing AMAZING.

It certainly isn't better than modern CPUs though.

For example, an 8-core 9700k can theoretically do around 900GFLOPS of single precision with AVX2 (32 FLOPs per core per clock at ~3.6GHz).

The cell processor theoretically can do 25.6GFLOPS per SPE, and the PS3 cell processor had 8 SPEs, so that's THEORETICALLY ~205GFLOPS.

But modern tasks that require higher floating point performance use the GPU instead, as GPUs are INSANELY good at this type of math as well.

Just for an example, a mid range GTX 1660Ti does 4608GFLOPS of single precision.
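For anyone who wants to check the 25.6 GFLOPS-per-SPE figure above: it's the usual peak formula, units × SIMD lanes × ops per fused multiply-add × clock. A small sketch (theoretical ceilings only; sustained throughput is always lower):

```python
def peak_gflops(units, simd_lanes, ops_per_fma, ghz):
    """Theoretical peak GFLOPS = units x lanes x ops/lane/cycle x clock (GHz)."""
    return units * simd_lanes * ops_per_fma * ghz

spe      = peak_gflops(1, 4, 2, 3.2)  # one SPE: 4 SP lanes, FMA = 2 ops -> 25.6
cell_all = peak_gflops(8, 4, 2, 3.2)  # all 8 SPEs -> ~205, the ~200GFLOPS figure above
cell_ps3 = peak_gflops(7, 4, 2, 3.2)  # retail PS3 enabled only 7 -> ~179
print(spe, cell_all, cell_ps3)
```

The same formula fed with a GPU's shader count and clock reproduces the multi-TFLOPS figures cited elsewhere in the thread.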
>>
>>74771952
>>74771845
hell, even the iGPU of a 9700k does 460.8 GFLOPS of single precision. More than double the theoretical maximum of the PS3's cell processor.
>>
>>74771952

Actually, in order to increase yields, PS3 only ended up with seven SPEs. And it wasn't like the Athlon IIs, where locked cores were only locked through software; if a chip managed to get all 8 working they'd laser cut it.
>>
>>74771995
So it's even worse with a theoretical maximum of only around 179GFLOPS.

Sad
>>
>>74769267
what if we made good hardware instead of bad hardware?
>>
>>74769359
>it stands to reason that if a PS3 could be emulated full speed on a modern Intel CPU then the cell has to be the weaker CPU
well since that hasn't happened outside of main menus, i take it cell was indeed the superior one???
>>
>>74769234
>>74769297
>>74769332
Sony's been doing this for years.
Have people forgotten about the PS2 "emotion engine" BS?
>>
>>74769234
The Cell CPU can perform less than 300 GFLOPs at theoretical maximum. Intel UHD's start at 600. Even if you put 32MB L3 cache in it, the damn CPU would still suck dicks.
But hey this doesn't stop hasbeens who spent their entire careers microoptimising ops for this flopped CPU to dream of a better future.
>>
>>74769841
They couldn't even if they tried. Any theoretical performance limit you can cite puts it slower than first gen i7s by any metric, and those were pretty shit.
>>
File: 4.2million.gif (1.35 MB, 500x282)
>>
>>74769322
huh?
that's what ryzen is great at though
do you are have retard?
>>
>>74771647
back to v
>>
File: hqdefault.jpg (7 KB, 480x360)
>>74769234
interlinked
>>
>>74769267
>then what if it didn't have those bottlenecks and was allowed to breathe?
Then we wouldn't be talking about PS3's Cell CPU anymore? Because it certainly has them.
>>
>>74773086
you can play demon's souls on a decent pc
>>
Go back to /v/eddit.
>>
>>74773086
It has ?
Many are playable now.
>>
>>74771658
>even
IBM makes excellent processors.
>>
>>74773086
>gow3 on ps3: 30fps
>gow3 on rpcs3: 40fps
interedusting
>>
>>74769322
lmao.
>>
>>74773137
marketing hyperbole will never die
back in the day I would argue with fanboys on usenet about how nintendo's "project reality" was just marketing horseshit and there was no way in fucking hell that some $250 console was going to contain any serious tens-of-thousands-of-dollar SGI hardware.
and I argued for hours with those faggots that all the graphics for the Killer Instinct arcade were prerendered. but noooo nintendo was going to put a high-end graphics workstation in every home!
>>
imagine mining crypto on cell.
>>
>>74773334
except they did

The Nintendo 64's central processing unit (CPU) is the NEC VR4300 at 93.75 MHz.[66] Popular Electronics said it had power similar to the Pentium processors found in desktop computers.[44] Except for its narrower 32-bit system bus, the VR4300 retained the computational abilities of the more powerful 64-bit MIPS R4300i,[66] though software rarely took advantage of 64-bit data precision operations. Nintendo 64 games generally used faster and more compact 32-bit data-operations,[67][self-published source?] as these were sufficient to generate 3D-scene data for the console's RSP (Reality Signal Processor) unit. In addition, 32-bit code executes faster and requires less storage space (which is at a premium on the Nintendo 64's cartridges).

source: wikipedia
>>
>>74773203
>>74773218
>>74773238
I'm pretty sure the emulator uses GPU for rendering.
>>
>>74773424
It's honestly amazing games ever worked on these platforms. Look what devs go through now on thoroughly documented platforms with high level languages and tonnes of iteration. Back then consoles launched with their own bespoke platforms, no one could patch after release, and somehow stuff mostly worked.
>>
>>74773454
Games were a lot more simple.
>>
Sony underestimated how stupid and lazy most developers are. The console was way ahead of its time, but due to how actually using the SPU properly required you to put in effort, it was a lost cause. That's why the exclusives were so amazing for it. As far as I know, Japan has a culture where you actually value and respect your work. Unlike America where you just shovel out shit as fast as possible to get your stock price up.

It just ended up showing people which developers were incompetent. Valve included.
>>
>>74773424
except it looked and played like absolute dogshit compared to even a midrange SGI of the era

source: my fucking eyeballs
>>
>>74773503
>takes a while to learn transferable skills in software dev field
>release a system that will never be used anywhere outside of gaming and for which you have to relearn everything you know
>people don't want to learn it
yeah how could anyone have seen this coming
>>
>>74769234
I really doubt it.
>>
>>74773503
>make money by making and releasing games on a fixed budget in a reasonable amount of time
>new console cpu is very powerful but you have to rewrite all of your algorithms, spend extra time you don't have to master its peculiarities, while also trying to make your code portable enough to be released elsewhere so you can, like, turn an actual profit and continue to make games
>only entity really profiting from such deep, esoteric optimization is Sony
>get called lazy because you don't orbit your whole fucking professional existence around a single manufacturer's hardware choices
OK
>>
>>74769908
Well, you could say that: even if the premise that the ps3's cpu was as fast as today's top of the line cpus were true (which it is not), they still could not squeeze as much power out of it as they wanted because it was too difficult. Not saying that that is true of course, but multiple devs did confirm that their games could have performed better with more research
>>
Did you know that the n64 could run Crysis if it weren't for the 64MB cartridge limitation?
>>
>>74773503
It's ironic because Japanese developers don't give a shit about graphics or complex simulation anymore. All of their games use boilerplate technology. The innovative techniques all come from the west.
>>
>>74770375
>Don't trust us? Look at the secure boot sources yourself — and modify them as you wish.
based
>$1636 for a "lite" board and a four-core
fugg
>>
>>74769234
It’s Stockholm syndrome.
There’s a whole bunch of former devs on twitter who sometimes unironically fawn over the Cell.
>>
>>74773398
https://github.com/verement/cellminer
>>
>>74769297
Why is MS so bad at making their own CPUs? They just copy whatever Sony goes with, but make it cheaper.
>>
>>74773822
????
MS and Sony both outsourced their designs (the Xenon and Cell CPUs, respectively) to IBM. Neither company has a microchip R&D department.
>>
>>74773684
If you care about security that much, you'll pay any price. People buy $3000 gaming PCs. For $3000 you can get a 4-way SMT 4-core Power9, that's 16 threads.
>>
>>74773503
Bullshit. Sony/IBM released 0 documentation on this obscure one-off trashy CPU. A lot of devs made late PS3 games look like early PS4 ones at lower res, so no, the developers adapted perfectly to this malengineered, maladvertised rock.
It's like if China released their own console on some XinLinPin 43-bit MIPS 2.0 that runs one stick of DDR4 and one stick of DDR5 concurrently, with no documentation, optimising compiler, or intrinsics.
>>
>>74769234
It's basically a GPU chip, so while it says it's more powerful, realize it isn't comparable.
>>
>>74773825
> outsourced the designs to IBM
Rather
> Sony: let's ask IBM to make our chip
> MS: according to our insiders, IBM makes PS3 chip
> IBM: y-yeah guys, we have just the thing for you if you need it
And so MS went for the same uarch. Then, after Sony cut ties with IBM for making a backstage deal despite Sony funding Cell development, MS went with the same manufacturer as Sony, again. Only this time, Sony didn't fund PS4 CPU creation and let AMD do their x86 Jaguar uarch.
>>
>>74769930
>quickly obsoleted GPU

You should remember that early designs for the PS3 were going to omit a dedicated GPU entirely and just let the CELL take care of everything. Sony was fucking mental when they were still on their PS2 high.
>>
>>74773908
>the same uarch
My friend...
>>
>>74773822
reason is simple, money.
If you know exactly what kind of CPU your opponent has you can make one that is similar in performance for less money. This is industrial espionage 101.
>>
>>74773437
The PS3 also used a GPU for rendering brainlet
>>
File: GAMECUBE_PAL.jpg (88 KB, 1200x850)
I miss the good old days.
>>
>>74774029
resident evil 4 still looks the best on a gamecube
>>
>>74769428
>pc and linux
>self explanatory

>xbox360 explodes

>wii is a toy

>mac is for fags

>ps3 has no ammo
>>
>>74771666
You fucking retard, imagine missing the point this hard. Call me when you find a game that looks and runs as well as TLoU on a 360
>>
>>74769428
XB360 is more like a taurus pt92 in the way it fails and is simply a poor man's poor man PC.
>>
>>74773437
>i'm pretty sure the emulator uses gpu, just like the ps3
interredusting, but i think your hole is already deep enough, m8
>>
File: Untitled.png (469 KB, 638x440)
>this tech is actually very advanced, please pay top dollar for it so we can get rid of all this inventory

Nice try (((sony))).
>>
>>74769428
Wii should be a guy holding a stick and shouting bang.
>>
>>74773911
Several people who worked on the PS3 have said that dual Cells were never part of the plan.
>>
>>74769234
Are you kidding me? Is that a citation? It reads like a 14 year old's essay
>>
>>74769234
not really
the cell was originally supposed to be used without a gpu, so it might stand a chance at being faster running say, a video game using _only_ a cell vs. a modern cpu (software rendered, not even igpu)
but the SPUs are too domain-specific afaik to directly compare to a general purpose x86 cpu
>>
>>74773634
There has not been innovation anywhere in years. Everything is just rehashing old concepts with brute force.
>>
>>74771952
But why did they decide that a CPU needed that kind of floating point performance? Why didn't they dump it to the GPU? (i mean historically speaking, didn't the GPU have higher floating point math capability)
>>
>>74771995
https://www.psdevwiki.com/ps3/Unlocking_the_8th_SPE
>>
>>74774476
It's the same motivation for AVX-512. There are a few applications that can make a very compelling case for it but the feature ends up disappointing when you see the other 99% of applications that can't use it.
>>
>>74774029
Console wars were a lot more fun when hardware and features were so wildly different between the competition. Systems had some real pros and cons to them, you could really feel the difference between a GC, PS2 and XB game. Now in the 8th gen it's just "X has more vram than Y" and it's boring.
>>
>>74774505
things were getting rather subtle by the ps2/gc/xbox generation, previous generations were much more wildly different
there are certainly things which separate ps2/gc/xbox games though, moreso than later generations
>>
>>74769234
>So powerful Square Enix stopped supporting it after one XIV expansion
>>
>>74774476
>But why did they decide that a CPU needed that kind of floating point performance? Why didn't they dump it to the GPU? (i mean historically speaking, didn't the GPU have higher floating point math capability)
It was supposed to do GPU tasks. Remember, back then there was no GPU compute at all. Before the PS3 launched they had all kinds of crazy ideas; the plan seemed to be to combine multiple cells to make things more powerful, since it was a homogeneous cpu+gpu type device. Something like the ps3 would have 2 or 3 cells, and if you bought a sony tv with cells in it, it could combine its computing power with the PS3 to allow better graphics.

If you look at some of the late PS3 games you could sort of see how it might have worked. If you have a ps3 + cell TV the ps3 could render the raw frame, send it to the TV cell which could do edge detect AA and various post processing filters, freeing up the PS3 SPEs for more GPU type work.
>>
>>74773152
>Intel UHD's start at 600
lol maybe if their drivers actually worked
>>
>>74773334
judging by what I've seen of those SGI workstations on youtube, I'd much rather have an n64.
>>
>>74773594
no but it was going to get a port of unreal which has much more soul
>>
The Cell was like an early attempt at an APU. Modern APUs are faster and more efficient.
>>
>>74775764
No
>>
>>74774125
>>ps3 has no ammo
look again
>>
>>74769234
This CPU takes all the defects of the pentium 4 and multiplies them by 10.
It's a hugely deep pipeline design WITHOUT a branch predictor.
Unless you're doing shader-like shit, this fuck will just keep tumbling over itself and delivering sub-Wii (per core) performance.
>>
>>74769428
Linux being disassembled from a single engineered system is misleading. More like a box full of random gun parts plus a shop to try to weld/machine them into a usable whole.
>>
If flops matter why don't you run Windows on your GPU.
>>
>>74769267
Then it would still only have one integer pipeline but be overkill on floating point performance. Game logic calculations and render command submission would still suck, but you could offload certain effects or physics from the GPU to the CPU.
>>
>>74776516
as long as the program is aware of the GPU, it will have significantly higher FLOP performance than a CPU.

Even basic bitch pentiums and celerons with the UHD605 or HD610 will outperform the cell processor in the PS3.

Intel's highest end iGPU, the Iris plus 650 gets 800GFLOPs+ single precision, which is more than 4x the GFLOPs of a PS3 cell processor.

Let's not even bring up actual gaming GPUs like the RTX 2070, which has 6497GFLOPs of single precision performance. Over 36 times the power of the PS3's cell processor.
>>
>>74769234
Depends on the workload.
>>
>>74776597
meant for >>74776491
>>
>>74773179
newfag doesn't even crossboard oh god
>>
>>74769234
Of course it's better than today's Intel CPUs. Almost any GPU is better than today's Intel CPUs. For certain tasks.
>>
>>74774249
It wasn't going to be dual Cells. It was just going to be the Cell itself.

https://www.ign.com/articles/2013/10/08/playstation-3-was-delayed-originally-planned-for-2005

And yes, unfortunately, that is the original source because IGN conducted the interview where the information appeared. But that would've been a disaster in and of itself with only a Cell, so Sony had to go to Nvidia for a 7800 when Nvidia was already working on Tesla, which is a shame since the 8xxx series had CUDA and unified shaders and that could've helped out the PS3 massively.
>>
>>74777702
>But that would've been a disaster in of itself with only a Cell
why? might as well go all in
>>
>>74777737
Not enough power. The Xbox 360 would've shat on it instead of merely rivaling it. You can't just do all graphical stuff on the SPEs, you might as well go with a more powerful GPU if that was the case.
>>
>>74777753
not if they had the full fat cell instead of a cut down version
>>
>>74777761
There's too much ASIC shit on a GPU that would make the Cell look like a joke if compared to the 360.
>>
>>74776609
Running windows != number crunching
>>
>>74777821
once devs learned the hardware it would be fine
>>
>>74777761
That's only one extra SPE.
>>
>>74769234
>btfo'd by a literal ps3 cell cpu
OH NO NO NO!
>>
>>74773594
>source: my ass
>>
>>74777829
No amount of learning will compensate for actual hardware that is there doing shit too fast for the cell to keep up with.
>>
File: g70pipeline.png (117 KB, 500x321)
>>74777761
1 extra SPE is 256KB of local store paired with its execution units, and it was also dual issue, which limited instructions emitted. That's not really going to help much; even a low end 7100 from Nvidia would've helped more, let alone a 7800 GTX. And the thing is that the PS3 could've shipped fully enabled, but they disabled one SPE for yields.
>>
Not only was the cell not very powerful, 2003-2006 was a terrible time to release hardware and expect it to stay relevant. Hardware was evolving really fast, and 2005 PCs were obsolete by 2006. Up to 2008, CPUs and GPUs released each year were leagues ahead of previous versions (intel, amd, ATI, nvidia, everyone was at the top of their game). Then around 2008-2009 we hit kind of a plateau; from there until 2014 you could run any game with 2008 tech.
This is why a 2010 ps3 game looks atrociously bad compared to the same game on PC, but a 2015 PS4 game looks kind of similar to the pc counterpart.

This is my take on the subject, I may be wrong but I feel like that's what happened. 2006 hardware got stale by 2007 and obsolete by 2008, which didn't happen before or after (PS2 hardware was stale by 2001, but it barely had an OS, which helped optimization)
>>
>>74774128
How about the last Tomb Raider game released on the 360 back in 2013?
>>
File: 30grof[1].jpg (231 KB, 1699x1920)
>>74774128
>a shitty looking game
>proof the hardware is better
>>
>>74778199
Didn't look nearly as good as TLoU
>>74778219
>says the game looks shitty
>can't name one (1) example of a better looking and performing game on the 360
Definitive proof that wojakposters are brain-damaged
>>
>>74774128
Rise of the Tomb Raider, the second one of the new ones.
It was optimized as fuck for the 360.
Amazing graphic detail AND performance.
>>
>>>/v/
>>
>>74778264
>PS3 can't even do party chats
>"b b but muh cell"

top kek mouth breathing retard.
>>
File: 1569691885834.jpg (665 KB, 1200x1200)
>>74769484
Got talked into buying a PS4. Found out most of the games I wanted to play were PS3 games.
>>
>>74773178
>>74773178
>huh?
>that's what ryzen is great at though
>do you are have retard?
>>74771614
>kill yourself intcel

Lmao talk about hitting a nerve.
>>
>>74778264
>Didn't look nearly as good as TLoU
Rise of the Tomb Raider from 2015, my bad:
https://www.youtube.com/watch?v=kXX31BFwFs0
>>
Why did they go with Cell and not Power?
>>
>>74778453
PS2 was a success and Ken Kutaragi wanted something similar to that design, a bunch of co-processors tuned for SIMD attached to a shitty CPU.
>>
>>74769234
Not in any useful sense. Even skylake-x CPUs do teraflops nowadays
>>
File: 1505759465846.jpg (114 KB, 1241x914)
>>74769234
>is this true?
Fuck no it's not, you fucking imbecile. The Cell BE in a PS3 has a performance of roughly 100 Gflops. That's fucking nothing. A fucking 4-core Haswell @4GHz has a theoretical performance of something like 256 Gflops, and that is a 6 year old architecture.

STOP SMOKING CRACK
>>
PS3 required devs to know how to take advantage of the CPU and write good code to get good performance.

Xbox 360 didn't, and since most game devs are college shits, of course they couldn't figure out the PS3.
>>
>>74778540
The cell reached 210 GFlops actually, but it's irrelevant when we're talking about a retarded 40 stage pipeline CPU with no branch predictor.
It's like a Pentium 4 with brain damage.
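Rough arithmetic on why pipeline depth plus poor prediction is so brutal — using the 40-stage figure from the post above, which is on the high side (the PPE is usually cited at around 23 stages, with static prediction only), so these are illustrative numbers, not measurements:

```python
def effective_ipc(base_ipc, branch_freq, miss_rate, flush_penalty):
    """Average IPC after adding pipeline-flush stalls for mispredicted branches."""
    cycles_per_instr = 1.0 / base_ipc + branch_freq * miss_rate * flush_penalty
    return 1.0 / cycles_per_instr

# illustrative workload: 1 instr/cycle peak, 20% branches, 10% of them mispredicted
deep    = effective_ipc(1.0, branch_freq=0.2, miss_rate=0.1, flush_penalty=40)
shallow = effective_ipc(1.0, branch_freq=0.2, miss_rate=0.1, flush_penalty=14)
print(f"40-stage flush: ~{deep:.2f} IPC, 14-stage flush: ~{shallow:.2f} IPC")
```

With these made-up-but-plausible rates the deep pipeline lands around 0.56 IPC versus 0.78 for the shallower one; branchy game logic eats it alive, while shader-like straight-line code barely branches and doesn't care.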
>>
>>74769322
based
>>
>>74778540
>STOP SMOKING CRACK
No.
>>
>>74778542
PS3 had two GPUs and no CPU.
>>
>>74778163
7th generation consoles missed GPGPU by like a year or so, which is why it was a shame that PS3 got a 7xxx chip instead of 8xxx, which was when Nvidia put CUDA on their chips. Also, graphics plateaued because most improvements after Crysis were really only in shadows/lighting and animations via digital acting.
>>
>>74769951
Add up the raw TFLOP performance of a cell processor including the vector units/graphics processing.

Then compare that to an Intel chip.

I don't know which one would win, but I do know Intel missed out on GPGPU performance. And yeah the vector cores on those cell/power9 chips were not easy to get code running on, even when they ran Linux.

The air force used to have a cluster of them.

IBM just opened up their power9 architecture for licensing to third party fabs, so they might make a comeback. Not likely but possible.
>>
>>74779080
There's 1/4 of a TFLOP on Cell in the best case.
>>
>>74769234
No, it isn't. It's fucking shit compared to today's CPUs. It was sort of impressive way back then. But that was very short lived.
It was a fucking pain in the ass to get any real performance out of it though.

People in this thread talking about "theoretical Gflops" on the Cell CPU are pulling shit out of their asses too. Each SPE did 25.6 Gflops single precision at 3.2GHz, and with the 6 SPEs games could use on PS3 that's a theoretical maximum of ~154 Gflops IF you could actually write code to harness that shit.

Nobody fucking could.

For comparison, a 4-core Sandy Bridge does ~220 Gflops single precision with AVX, and hitting a decent chunk of that with normal code is actually feasible. Halve that for double precision, where the gap grows to nearly an order of magnitude because the original Cell's SPEs were gimped at DP. Cell ain't all that, shit nostalgiafaggots.
>>
>>74774128
Crysis 3 and it looks better
>>
>>74778414
quote properly, intcel
>>
>>74778163
>2003-2006 was a terrible time to release hardware and expect it to be relevant
you know the ps3 outsold the 360 right? Seems pretty relevant, and now everything is heavily multithreaded because of it, so you should be thanking Sony for giving devs a head start on parallelizing their code.
>>
>>74780860
A GOOD multicore CPU would have been a lot better for the developers back then.
What Cell did was to neuter AI, gameplay and data compression because it was shit as a general purpose CPU and those things were too demanding for it.
They would be better off with a literal Athlon 64 x2 than that cell shit.
>>
>>74781002
then why were ps3 exclusives much more impressive than anything else at the time?
>>
>>74781039
Because it was a system with two GPUs.
But in terms of gameplay, AI, physics etc there was no evolution.
Basically the same shit except a few games that tried, like Red Faction: Guerrilla, which ran like absolute dogshit on the PS3.
>>
>>74781061
well it's not like any other system had anything better
>>
File: THE POWER OF THE CELL.jpg (146 KB, 1280x720)
146 KB
146 KB JPG
>>
>>74781066
Not in terms of AAA at least, because both consoles had absolute garbage CPUs, with the 360's Xenon being basically three Cell PPEs with better inter-core communication and no SPEs.
>>
>>74781125
there hasn't been anything better in AI on pc either
>>
>>74781140
Dorf is absolutely the anti-cell game.
>>
>>74769322
based brainlet
>>
File: power of the cell.png (120 KB, 817x1372)
120 KB
120 KB PNG
>>74769234
>>
>>74781140
That's because game AI is simple behavior trees. It's not like ML or anything, a monkey could implement it and in fact I have so that settles it
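In that spirit, here's about how small a behavior tree really is. Node and task names are made up for illustration, not from any engine:

```python
# Minimal behavior tree: a selector runs children until one succeeds,
# a sequence runs children until one fails. Leaves are plain callables
# that mutate a shared state dict and return success/failure.

def selector(*children):
    def tick(state):
        return any(child(state) for child in children)
    return tick

def sequence(*children):
    def tick(state):
        return all(child(state) for child in children)
    return tick

# Toy enemy "AI": attack if the player is close, otherwise patrol.
def player_close(state): return state["distance"] < 5
def attack(state): state["action"] = "attack"; return True
def patrol(state): state["action"] = "patrol"; return True

enemy = selector(sequence(player_close, attack), patrol)

state = {"distance": 3}
enemy(state)   # state["action"] is now "attack"
```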
>>
File: n64 devkit.png (48 KB, 849x458)
48 KB
48 KB PNG
>>74773334
>>74773424
>>
>>74769234
If you slap tasks that a GPU would be doing at it, I wouldn't be surprised.
>>
File: latest.png (1017 KB, 644x1321)
1017 KB
1017 KB PNG
>>74777761
>the full fat cell
You mean the Perfect Cell?
>>
>>74778199
It's also on PS3 and doesn't look half as good. GTA V was a better game graphically, but it still doesn't come close to TLoU in pure graphics.
>>
>>74781703
but I thought it doesn't have anything at all in common with modern gpus which is why n64 emulation is so shit and based on glide
>>
>>74781665
>for certain workloads (one example is emulators)
stopped reading there
>>
>>74778271
Looks kinda like Uncharted 3 but with bad textures.
>>
File: 1572413700919.jpg (361 KB, 960x2472)
361 KB
361 KB JPG
>>74769234
games are about being FUN, not updating zillions of polygons per millisecond
>>
>>74781853
It's based on Glide because shit barely evolved since the early 2000s. N64 emulation is mostly shit due to custom microcode (something something programmable GPU magic) from what I've heard.
>>
>>74781665
oh yes, the internet guy that knows more about CPU architectures than both IBM and Sony.
>>
>>74783895
Just look at the GDC slides. It's not a secret. The only good thing is the short path to offload work onto the SPEs; that offload is a huge cost for modern CPUs, and pushing work onto the GPU is only beneficial for big chunks. With rising transistor counts all they do is add fixed-function units. Look at a modern mobile SoC: 7/10 of the chip area is fixed-function stuff. AVX-512 is faster than a Quadro if the working set is small.
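The "only beneficial for big chunks" part drops out of a toy latency model. All the rates and overheads below are invented for illustration, not measured from any real GPU:

```python
# Toy model: offloading pays off only when the work amortizes the fixed
# cost of shipping data to the accelerator. Rates are elements per
# microsecond; every number here is made up for illustration.

def cpu_time_us(n, cpu_rate=10.0):
    return n / cpu_rate

def gpu_time_us(n, gpu_rate=100.0, overhead_us=50.0):
    return overhead_us + n / gpu_rate

def offload_wins(n):
    return gpu_time_us(n) < cpu_time_us(n)

# Crossover point: overhead / (1/cpu_rate - 1/gpu_rate) = 50 / 0.09 ~ 556 elements.
offload_wins(100)    # False: small working set, the CPU wins
offload_wins(10000)  # True: a big chunk amortizes the transfer cost
```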
>>
>>74770063
It can run anything not late gen, are you retarded?
>>
>>74769267
Why didn't they just reprogram the synapses to work collectively?
>>
>>74783801
hardly any games used actual custom microcode, most just used the few off-the-shelf ones from the official SDK
>>
>>74769234
it's like saying a Honda Civic is more powerful than a Peterbilt because its 0-60 time is better
>>
>>74769484
damn ps3 toddlers btfo, how will they ever recover?
>>
>>74769234
yes. there was a shortage of consoles at one point because clusters of ps3 were used for scientific calculation.
>>
A Dutch guy said it so it must be true

Also shintel
>>
>>74774190
I hate that channel
>>
File: file.png (39 KB, 604x453)
39 KB
39 KB PNG
>>74781864
I really wish people quoted academic papers and presentations for this kind of stuff. People who used the Cell in university knew about its shortcomings since 2007-2008. And game developers surely knew by this time the same thing.

http://www.netlib.org/lapack/lawnspdf/lawn185.pdf

http://cado.gforge.inria.fr/workshop/slides/bos.pdf
>>
>>74785581
>http://www.netlib.org/lapack/lawnspdf/lawn185.pdf
Someone actually wrote a paper on why game consoles aren't good for general use supercomputing, holy shit.
>>
>>74785596
It got cited in 45 other papers too according to Google, which is quite respectable for something as quaint as this. But I've read enough papers to tell good from bad, and I'd never recommend reading stuff that isn't backed up by facts. These people did their research by writing code, profiling it, and interpreting the results in enough detail to make it factual and trustworthy in my view. But yeah, you can get bad papers for stuff like this. Look at some third-world journals, like from India. I'm not saying they never do great work, but they produce a lot of trash papers.
>>
>>74769234
Like most PowerPC CPUs they are very powerful for their clock but are a pain in the ass to make anything run on. It's the reason why they have been mostly abandoned.
>>
>>74784613
Many of them slightly modified the standard microcode, or abused its bugs and quirks.
>>
Anyone complaining about 'difficulty to code for' is a pajeet
Anyone saying it's false is a PCJew Steamshill
Hope Gayben is paying you all well
>>
>>74781864
Anything that uses if statements on the Cell is painfully slow.
>>
>>74769322
kek
>>
>>74769322
Wrong but still funny
>>
>>74776247
Just because a magazine is inserted doesn't mean there's rounds in it.
>>
>>74769322
So when it comes to CPUs it goes

intel>ps3>ryzen right?
>>
>>74785581
>>74781703
>>74781665
What a load of rhetorically gentoomen befitting but intrinsically arrogant, & either subjective or factually incorrect pile of feces.
Go read https://en.wikipedia.org/wiki/AltiVec for starters.
>>
>>74769234
So this thing can beat 56 cores with avx512?!!? mind blown
>>
>>74787710
Stronker than the architecture found in those 56 cores indeed. But the comparison isn't honest because x86 carries almost half a century of legacy while the cell does not.
>>
>>74769908
eh it depends on the game. the more demanding ones run like shit with 8 threads. you need 12 or 16 threads minimum.
>>
The cell has a strange cult, it was hardly impressive in its time when compared to its peers. It was always useless for code that actually exists and was blown the fuck out by the first nehalem in vector code.
>>
>>74787627
Go read https://en.wikipedia.org/wiki/Pipeline_stall
>>
>>74787942
>because x86 carries almost half a century of legacy
Hasn't been a practical issue since P6
Legacy instructions don't exist in hardware anymore
>>
>>74783895
It's not uncommon for developers to find out more about the capabilities of hardware than the hardware manufacturers themselves. Hardware is designed for a relatively narrow operation, developers are monkeys trying to jerry rig everything into doing shit it shouldn't.

Case in point, PS4 developers invented asynchronous compute on GCN and it was later formalized as a feature for Mantle and DX12.
>>
If FLOPs were so important and determining for things like compute, then why doesn't Vega smoke every GeForce card in its intended use cases and number crunching?
>>
>>74788359
Radeon VII
>real clock after this pos throttles
1.6ghz
>shaders
3840
1.6*3840*2 = 12.3TFlop

2080ti
>real clock after boost 4.0 or whatever
1.8ghz
>shaders
4352
1.8*4352*2 = 15.7Tflop
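Same math in one line for checking other cards (the factor of 2 is one FMA, i.e. mul+add, per shader per clock; the clocks above are the poster's estimates, not official boost specs):

```python
# Peak shader TFLOPS = shaders * 2 flops per clock (FMA) * clock in GHz / 1000.
def tflops(shaders, ghz):
    return shaders * 2 * ghz / 1000

radeon_vii = tflops(3840, 1.6)   # ~12.3 TFLOPS
rtx_2080ti = tflops(4352, 1.8)   # ~15.7 TFLOPS
```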
>>
This whole thing kinda reminds me of the Atari Jaguar. Quote from AVGN as it's the only explanation I've really seen.
>Well, we do know Atari was originally planning a 32-bit system called the panther but decided to skip it and leap ahead. The Jaguar still used the 32-bit graphics processing unit, but through a combination of other processors somehow added up to 64. It's technical and confusing, but the point is the Jaguar was a rare species not built like other game consoles. That made it harder for programmers to develop games on it, and as a result, many games didn't utilize its full capabilities, whatever they could have been.
>>
>>74788114
Just write good code. Drop the in-order and out-of-order paradigms. The architecture was deliberately designed around streams instead, as they predicted game engines would become streaming too. Also don't forget every SPU had its own 256K of local store, and to top it off, this: https://en.wikipedia.org/wiki/XDR_DRAM
>>
>Vega 10 XT AIR has almost identical floating point operations as TU102-300A-K1-A1
>>
>>74788573
ps3 is more reminiscent of sega saturn if anything
>>
>>74773908
Xbox 360 released a year prior to PS3
>>
>>74789093
"good code" won't save your ass in things that require a lot of ifs, such as data compression, running scripts, path finding, AI, octree optimizations, geometry mangling...
For those you need a CPU with a small pipeline, so the stalls don't hurt as much.
And the cell have a pipeline larger than the fucking pentium 4 because they were dreaming of running it at 5Ghz but this plan busted.
A couple of PowerPC G3 cores in the cell would do it wonders.
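The cost being argued about here is easy to put numbers on. A toy throughput model, with every rate below an assumption picked just to show the shape of the effect:

```python
# Toy model of branchy code: each mispredicted branch flushes the
# pipeline, costing roughly its depth in cycles. Deeper pipe = bigger hit.

def effective_ipc(base_ipc, branch_freq, mispredict_rate, flush_cycles):
    # cycles per instruction = baseline cost + expected flush cost per instruction
    cpi = 1 / base_ipc + branch_freq * mispredict_rate * flush_cycles
    return 1 / cpi

# Assume a branch every 5th instruction, half mispredicted (static
# prediction on data-dependent branches): a 23-stage pipe vs a short 8-stage one.
deep  = effective_ipc(1.0, 0.2, 0.5, 23)   # ~0.30 IPC
short = effective_ipc(1.0, 0.2, 0.5, 8)    # ~0.56 IPC
```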
>>
>>74789234
Those are the tasks you were supposed to do on the SPUs, not on the POWER core.
I remember stalling myself back then because of race conditions, with the solutions pointing out that either the POWER core wasn't able to direct the SPUs or there was a lack of bandwidth due to latency issues.
>>
>>74789377
Those things are BAD for the SPUs, they're like the worst case possible.
Zero floating point ops, 100% branch stall.
Things that run ifs for every bit in the data stream, that test a bunch of data against each other, things that are a huge list of ifs where only one of them is taken but you're supposed to do that millions of times per second.
There's nothing inside the Cell CPU that can truck through them.
>>
>>74789437
AI, Pathfinding, yes. data compression & geometry mangling: No.
>>
>>74773504
What games were you playing on an SGI?
>>
>>74770237
no it wasn't. especially in cities.
>>
>>74789537
How the fuck do you implement Huffman without ifs?
>>
>>74789877
256K of SRAM is not going to cut it, so either you split up and postpone the math or you use a more suitable, although less efficient, alternative to Huffman. If you're lucky, the performance increase offsets the loss.
>>
>>74790110
The problem is not the memory, it's the stalls.
You can't branch-predict Huffman, so half of the time your CPU will be clearing the pipeline, and that takes a long time on the Cell.
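For what it's worth, the standard workaround is a table-driven decoder: peek a fixed number of bits and look up symbol + code length at once, so the hot loop has no per-bit data-dependent ifs. A toy sketch with a tiny hand-built code (purely illustrative, not claiming this is what any PS3 game shipped):

```python
# Table-driven Huffman decode: precompute, for every possible K-bit
# window, which symbol it starts with and how many bits that code
# consumes. The decode loop then never branches on individual bits.

K = 3
# Toy prefix-free code: A=0, B=10, C=11.
CODES = {"A": ("0", 1), "B": ("10", 2), "C": ("11", 2)}

TABLE = [None] * (1 << K)
for sym, (bits, length) in CODES.items():
    prefix = int(bits, 2) << (K - length)
    for tail in range(1 << (K - length)):     # fill every window starting with this code
        TABLE[prefix | tail] = (sym, length)

def decode(bitstring):
    out, pos = [], 0
    while pos < len(bitstring):
        window = bitstring[pos:pos + K].ljust(K, "0")   # pad the tail of the stream
        sym, length = TABLE[int(window, 2)]
        out.append(sym)
        pos += length
    return "".join(out)

decode("01011")  # "ABC"  (codes 0, 10, 11 back to back)
```

Real decoders (zlib etc.) use exactly this idea with bigger tables and sub-tables for long codes.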
>>
>>74790184
The median expected output is reasonably predictable and the timings are known. Masking, vectorizing and SIMD are cheap, and you can schedule independently of the SPE. But I have to admit I hardly got that solution to work, due to the reasons mentioned above.
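The masking trick mentioned here is select-by-mask: compute both sides of the if and merge with a bitmask, which is how SIMD lanes (the SPEs included) handle conditionals without branching. A scalar sketch on 32-bit values, just the idea rather than actual SPE intrinsics:

```python
# Branchless select: build an all-ones/all-zeros mask from the condition,
# then merge the two candidate results. No data-dependent branch in the merge.
M32 = 0xFFFFFFFF

def select(cond, a, b):
    mask = -int(bool(cond)) & M32        # 0xFFFFFFFF if cond else 0
    return (a & mask) | (b & ~mask & M32)

def branchless_max(x, y):
    return select(x > y, x, y)

branchless_max(7, 3)   # 7
```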
>>
>>74790295
Which is why the Cell needed some cores with much shorter pipelines, like 8 stages.
Even if that part of the chip could clock at only 2GHz, developers would be able to toss the "rocky ifs" onto those cores while the SPEs handle the multiply-flood tasks, like shader operations etc.
The combo would be great at decoding MPEG/JPEG as well, because using one "short core" for the Huffman and a "float core" for the YUV and IDCT would be just freaking fast.
>>
File: 1581201975736.jpg (460 KB, 1920x1080)
460 KB
460 KB JPG
If they had managed to hit the original designed specs it would have been a fucking monster.

Remember the whole thing was built with distributed computing and IoT in mind, not just high-end computing.

----

>Tue, Mar. 04, 2003
>With the PS 3, Sony will apparently put 72 processors on a single chip: eight PowerPC microprocessors, each of which controls eight auxiliary processors.

----

>As for PS3, long before the official specs were revealed, there were patents in late 2002 or early 2003 that showed a version of CELL with 4 PPEs and 32 SPUs
(these had different names at the time) on a single chip. This was called the 'Broadband Engine' (different than the final CELL Broadband Engine revealed in 2005).

>Right next to this was a CELL-based GPU with 4 "lighter" PPEs, 16 SPUs, pixel engines, embedded video memory, and probably other stuff. This was called the 'Visualizer'.

---

45nm Cell runs at only ~0.8V vcore at 3.2GHz

they hit 6GHz with 1.15V at 45nm

at 90nm they were able to hit 5.6GHz with 1.4V
>>
>>74790530
Yeah, but you're talking about expensive housefires at that kind of voltage and frequency, not to mention how big that chip would be on 90nm silicon.
>>
>>74773454
this was after the era where everything was just done in assembly and debug was more intense
>>
>>74774474
if you're jaded and nostalgic for the past this is how you see everything
>>
>>74773503
>Japan has a culture where you actually value and respect your work
laughs in suicide rate
>>
>>74783646
this
>>
>>74769234
the only thing it was good for is having a cool name
CELL BROADBAND ENGINE is a cool name
>>
>>74790530
damn cell was a beast
>>
What about the PS2 Emotion Engine? It was pretty similar to the Cell design-wise. The Cell was basically a very pumped-up EE. They scrapped the idea of shipping two Cells and had to go with that piece of shit RSX because they realized too late that GPU offload was going to be the future.
>>
>>74791923
the EE didn't do graphics rendering either.
It was just about vertex math and sound mixing.
>>
>>74791923
>emotion engine
"the only emotion you're giving the developers is frustration"
https://www.youtube.com/watch?v=XxLLhwCx4cM
>>
>>74769234
It's more powerful in the sense that GPUs are more powerful than CPUs...

But you're not running real programs on GPUs, so the point is moot.
>>
>>74769428
The Linux and Mac pics are right on.
>>
>>74790530
>>74791333

They wanted to shove it into all sorts of shit like DVD/Blu-ray players, TVs, etc.

It's basically a CPU that has tons of SIMD or floating point capability.
>>
File: ps2.png (70 KB, 850x665)
70 KB
70 KB PNG
>>74791923
>>
File: C3.png (1.45 MB, 1296x791)
1.45 MB
1.45 MB PNG
>>74779301
Crysis 3 only ran marginally better on 360 because the textures were rendered incorrectly, so it was effectively running at lower quality for 2 more fps.
Being a broken mess does not make it any smarter at all.
>>
>>74785581
hahahaha omg
>>
>>74791923
>gpu offload was gonna be the future
gpu offloading had been the present for about 10 years by the time the ps3 came out


