Global recession over the end of Moore's Law - page 4

 
СанСаныч Фоменко:

You and I are talking about different things.

We're past that.

Why different? It's the basics of analogue and digital.
 
Alexey Busygin:
Why different? It's the basics of analogue and digital.

What does that have to do with the basics...

I'm strictly on topic: digital computing with its processors is far from all there is to computing, ideologically. The technical differences are another matter. That's my point.

 
From such bullshit, such global conclusions
 
СанСаныч Фоменко:

What does that have to do with the basics...

I'm strictly on topic: digital computing with its processors is far from all there is to computing, ideologically. The technical differences are another matter. That's my point.

You decided to compare a 70s-80s processor with a modern one and ask which is faster. Well, old computers had a clock speed of 24 MHz, while today the average is 2.2 GHz. So which is faster?
 
Alexey Busygin:
You decided to compare a 70s-80s processor with a modern one and ask which is faster. Well, old computers had a clock speed of 24 MHz, while today the average is 2.2 GHz. So which is faster?

It seems to me that you can't measure by frequency.

1. What we consume is operations. BESM was built for scientific calculations, and its word length was 45 bits (or 48, I don't remember exactly).

2. BESM exchanged data in whole words. Today 64 bits is the latest fashion.

3. A modern PC is a computer with the most wretched architecture possible: a shared bus. BESM was not like that.

So it is difficult to compare.

If you compare at all, compare the number of long floating-point operations.

 
СанСаныч Фоменко:

It seems to me that you can't measure by frequency.

1. What we consume is operations. BESM was built for scientific calculations, and its word length was 45 bits (or 48, I don't remember exactly).

2. BESM exchanged data in whole words. Today 64 bits is the latest fashion.

3. A modern PC is a computer with the most wretched architecture possible: a shared bus. BESM was not like that.

So it is difficult to compare.

If you compare at all, compare the number of long floating-point operations.

In word length and the rest, the old still loses to the new. 64 bits is the bus width. PCs, like those computers, run machine code. The only thing that hasn't changed since then is the price of components: an old computer would cost as much to build now as it did back then.
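If, as suggested above, machines are to be compared by long floating-point operations rather than clock frequency, the measurement can be sketched roughly. A minimal pure-Python sketch (it grossly understates the hardware, since the interpreter dominates; the loop body and iteration count are illustrative choices, not figures from the thread):

```python
import time

def estimate_mflops(n=2_000_000):
    """Very rough double-precision MFLOP/s estimate: time a
    multiply-add loop, counting 2 floating-point ops per iteration."""
    x = 1.0000001
    acc = 0.0
    t0 = time.perf_counter()
    for _ in range(n):
        acc = acc * x + 1.0   # one multiply + one add
    dt = time.perf_counter() - t0
    return 2 * n / dt / 1e6

print(f"~{estimate_mflops():.0f} MFLOP/s (single core, interpreted Python)")
```

For scale, the BESM-6 is usually quoted at about a million operations per second; a serious comparison would use an optimized native benchmark such as LINPACK rather than an interpreted loop.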
 
Are you comparing a modern PC to a computer from the 60s, when RAM was measured in KB and frequency in MHz?
 
Dmitry Fedoseev:
Are you comparing a modern PC to a computer from the 60s, when RAM was measured in KB and frequency in MHz?
Well yes, that's exactly what we're comparing: apples to oranges.
 
Dmitry Fedoseev:
Are you comparing a modern PC to a computer from the 60s, when RAM was measured in KB and frequency in MHz?

Well, yes.

There is no way to compare by the measure that applied to BESM: floating-point operations.

 
Alexander Laur:

In terms of the microprocessor element base there will be two paths: the first is to replace the material with one better suited to higher-power tasks; the second is to change the microprocessor architecture. This process will continue indefinitely. But in my opinion the progress of the near future will be connected not with the element base but with software technology. Why? Because so many processors have already been made, with so much total power, that it will take humanity time to figure out how to use that power efficiently. And the efficient use of that power lies in the development of new software technologies.

Recently I read news that MSU physicists created a data-processing algorithm that uses the GPU cores of an ordinary PC and is 100 times faster than supercomputer processing. The author notes that German colleagues needed 3 days on a supercomputer to process data that our algorithm handled in 15 minutes on a PC (!!!). Feel the difference. And how can you even compare the speed of optimization with the MC cloud connected against optimization on your own PC, however cool and powerful it is?

So the future is in software technology! There's plenty of room for creativity here! :)

The first GPU with a significant number of cores appeared in 2007 (Nvidia, 128 cores). Today the count has grown to about 3,000. Did your PC get 128 times faster in 2007 versus 2006? Is it 3,000 times faster today than in 2006? No. The cores are still used mainly for graphics, where parallelizing is easy.

In 2009-2010 I tried programming a 256-core GPU myself. I quickly found out why most software doesn't use multiple cores: it is very difficult, because the developer has to decide manually which parts of the program can run in parallel. I did finish the new code, and with 256 cores it ran 3 times faster. I was even pleased with a 3x speedup. But when I had to write the next program, I remembered the agony and gave up on parallelizing. Of course, particular problems such as graphics and database processing will keep using many cores, but other programs will only be helped by a new compiler that automatically finds the parallelizable parts of a program, and as far as I know no such compiler exists.
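The gap between core count and real speedup described above is what Amdahl's law predicts when only part of a program can run in parallel. A minimal sketch (the 2/3 parallel fraction is an assumed value chosen to match the reported 3x on 256 cores, not a figure from the post):

```python
def amdahl_speedup(p, n):
    """Amdahl's law: overall speedup on n cores when a fraction p
    of the work is parallelizable and the rest stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# A program that is only ~67% parallelizable tops out near 3x
# even on 256 cores -- consistent with the experience above.
print(round(amdahl_speedup(2 / 3, 256), 2))  # -> 2.98

# A perfectly parallel program (p = 1.0) would scale linearly.
print(amdahl_speedup(1.0, 128))  # -> 128.0
```

The serial fraction, not the core count, sets the ceiling: with p = 2/3, even infinitely many cores give at most 3x.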

I don't deny the potential for improving software and making computers faster on that basis. My point is that purchases of new computers and smartphones will fall in 2021-2022 because hardware improvement has stalled. Will you buy a new computer if it has the same cores, memory and frequency as your old one? Probably not; you will buy new software instead. All hardware manufacturers and the related industries will go into recession, with mass unemployment.
