Global recession over the end of Moore's Law - page 6

 
Nikolay Demko:

So in the future, dedicated compute farms will be built for users, and desktops and handhelds will be just terminals for communicating with the mainframe.

Hence the moral: Intel will be busier than ever, because even the processors that already exist will be needed in ever greater numbers, and I don't foresee any stagnation in this industry (imho).

About the growth of cloud computing I agree. But it is not clear how that will save laptops and smartphones: they will turn into terminals for connecting to the Internet, and their improvement will depend on data transfer speed. The first generation of smartphones handled 1-2 Mbps, the second 10 Mbps, the third 54 Mbps, today's fourth 600 Mbps, and the coming fifth will reach 5 Gbps using millimetre waves. The first tablets and smartphones with this capability are to be demonstrated at the 2018 Winter Olympics in Pyeongchang, with mass production in 2020. After that, further increases in data speed will have no tangible effect, and pushing frequencies beyond millimetre waves becomes impractical because the usable range drops to less than 10 metres. So laptops and smartphones, turned into cloud-connectivity terminals, also hit a development limit around 2020, just like silicon microprocessor technology, and their sales will fall sharply starting in 2020. Computing centres will grow, but that is no consolation for, say, smartphone screen makers. Apple will also be hit hard, because its business is consumer oriented, not server oriented.
 
Vladimir:

The power consumption of a logic circuit element is calculated with the formula:

P = f*C*V^2

where f is the frequency, C is the load capacitance (the input capacitance of the next element plus the capacitance of the metal interconnect), and V is the supply voltage. Frequency stopped increasing 5-10 years ago and has been stuck at 2-3 GHz. Smaller transistor size led to lower load capacitance (lower transistor input capacitance and shorter connections between transistors) and to a lower supply voltage. When I started in the industry the supply voltage was 5 V, then 3 V, 1.5 V, 1 V and now 0.5 V. Each new generation of silicon technology now brings a voltage reduction of only 0.1-0.2 V. When Moore's law stops, the power stops decreasing and the number of cores stops growing.
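To make the scaling concrete, here is a small illustrative sketch in C++ (the same host language as the CUDA examples later in the thread) that plugs assumed values for f, C and V into P = f*C*V^2. The capacitance and voltage figures are made up for illustration, not measurements of any real process node; they only show that with frequency flat, power falls with C and with the square of V.

    #include <cstdio>

    // Dynamic switching power of one logic element: P = f * C * V^2
    static double dynamic_power(double f_hz, double c_farad, double v_volt)
    {
        return f_hz * c_farad * v_volt * v_volt;
    }

    int main()
    {
        // Assumed, purely illustrative values: frequency stays at 3 GHz
        // while load capacitance and supply voltage shrink per generation.
        struct Node { const char* name; double c_fF; double v_volt; };
        const Node nodes[] = {
            { "older node",  2.0, 1.5 },
            { "middle node", 1.0, 1.0 },
            { "recent node", 0.5, 0.5 },
        };
        const double f_hz = 3.0e9;  // 3 GHz, flat for the last decade

        for (const Node& n : nodes) {
            double p_watt = dynamic_power(f_hz, n.c_fF * 1e-15, n.v_volt);
            std::printf("%-12s C = %.1f fF, V = %.1f V -> P = %.2f uW per element\n",
                        n.name, n.c_fF, n.v_volt, p_watt * 1e6);
        }
        return 0;
    }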

Few people are aware of the fact that all integrated circuit technology was developed by Intel. Every company in the world copies Intel. They invented the FinFET 10 years ago and spent those 10 years putting it into production. My friends at Intel tell me they have no more ideas. Our company is funding research at various universities, but so far there is nothing. The world is on the verge of some pretty dire consequences of the end of Moore's law. In difficult economic times world wars tend to occur, leading to a surge of state investment in new technologies and the subsequent development of those technologies for peaceful purposes. That was the case in WWII: Alan Turing invented the computer to decode German military messages. 25-30 years ago, as a consequence of the computer revolution, there was a need to network computers, and the Internet was born. In the last 10 years the Internet has essentially changed little; today smartphones connect to it at almost the same speed as a home computer. I can't imagine what new technology will take the place of computers and the Internet and allow worldwide economic growth to continue.

Yeah, that's what I was talking about )
 
Alexey Busygin:
Can you recommend a CPU with 256 cores?
NVidia latest > 2000 cores
 
Vladimir:
The limit of development in 2020

How can you be so sure that if you can't see a prospect, there isn't one?

I admit that it is possible to be a good expert in one industry and have a good idea of what awaits it on a horizon of 2-5 years (although in the technology sector even that is too distant a horizon, imho).
But one cannot be aware of all research in all related fields, can one?

If a ceiling of gigahertz or gigabits per second is reached, then some alternative (surely an order of magnitude more powerful) will be found and the world will continue to evolve. Provided, of course, that such development is actually needed.

It is like arguing about hydropower production and being upset that in 5 years all the rivers will be used as efficiently as possible, while not seeing that there are plenty of far more powerful alternatives, from thermal to nuclear power.

Why the melancholy? )

 
Optics rules: a former Russian scientist in America has already managed to retain information in a crystal for more than 0.5 seconds without power.
 
Alexey Volchanskiy:
NVidia latest > 2000 cores
I can't find any CPUs, only cards
 
Alexey Busygin:
I can't find any processors from them, only cards

The card is the processor. You write code in Visual Studio 10 in the CUDA C language, compile it for the GPU and run it. Writing code for a GPU is much harder than for a CPU: you need to add commands for allocating memory on the GPU (there is usually not much of it), transferring data from CPU memory to GPU memory, special parallelisation commands, then copying the data back, freeing the memory, and so on. A lot of different subtleties, but you get to use 3000 cores. See here

https://developer.nvidia.com/how-to-cuda-c-cpp

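As a rough sketch of the workflow described above (allocate memory on the GPU, copy the data over, launch a kernel across many threads, copy the result back and free the memory), here is a minimal CUDA C example that simply doubles an array on the GPU. The array size and the kernel are arbitrary illustrations, not code from the linked page; it is built with nvcc.

    #include <cuda_runtime.h>
    #include <cstdio>

    // Trivial kernel: each GPU thread doubles one element of the array.
    __global__ void double_elements(float* data, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= 2.0f;
    }

    int main()
    {
        const int n = 1 << 20;                      // 1M floats, arbitrary size
        const size_t bytes = n * sizeof(float);

        float* host = new float[n];
        for (int i = 0; i < n; ++i) host[i] = (float)i;

        float* dev = nullptr;
        cudaMalloc(&dev, bytes);                               // allocate GPU memory
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);  // CPU -> GPU

        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        double_elements<<<blocks, threads>>>(dev, n);          // run across thousands of cores
        cudaDeviceSynchronize();

        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);  // GPU -> CPU
        cudaFree(dev);                                         // free GPU memory

        std::printf("host[42] = %f\n", host[42]);              // expect 84.0
        delete[] host;
        return 0;
    }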
 
Vladimir:

The card is the processor. You write code in Visual Studio 10 in the CUDA C language, compile it for the GPU and run it. Writing code for a GPU is much harder than for a CPU: you need to add commands for allocating memory on the GPU (there is usually not much of it), transferring data from CPU memory to GPU memory, special parallelisation commands, then copying the data back, freeing the memory, and so on. A lot of different subtleties, but you get to use 3000 cores. See here

https://developer.nvidia.com/how-to-cuda-c-cpp

I asked about the processor, not the expansion card; they use different slots.
 
Andrey Khatimlianskii:

How can you be so sure that if you can't see a prospect, there isn't one?

I admit that it is possible to be a good expert in one industry and have a good idea of what awaits it on a horizon of 2-5 years (although in the technology sector even that is too distant a horizon, imho).
But one cannot be aware of all research in all related fields, can one?

If a ceiling of gigahertz or gigabits per second is reached, then some alternative (surely an order of magnitude more powerful) will be found and the world will continue to evolve. Provided, of course, that such development is actually needed.

It is like arguing about hydropower production and being upset that in 5 years all the rivers will be used as efficiently as possible, while not seeing that there are plenty of far more powerful alternatives, from thermal to nuclear power.

Why the melancholy? )

There is no melancholy, there is fear for the future, both my own and other people's. It is certainly easier to live when you trust that scientists will find a solution to any problem: a new technology, a cure for cancer, an answer to global warming. The end of Moore's law is a very real issue; read the recent articles on the subject. My view may be pessimistic, but it is based on a deep knowledge of semiconductor technology and, because of my specialty, on the latest research in the field. It takes about 10 years to bring a new technology to mass production, and so far no such technology has appeared in the laboratories of companies or universities. So I expect 5-10 years of stagnation in computer technology, maybe even longer.

There is a worldwide organization, the ITRS (International Technology Roadmap for Semiconductors), made up of employees of the major semiconductor companies, which publishes the semiconductor roadmap for the near future (their vision of where the technology is going). They have been publishing this roadmap since 1965, every two years. The last one was the 2014 report. The next report was due this summer, and everyone in the field was looking forward to it, but it never came out; instead the organization was renamed the International Roadmap for Devices and Systems (IRDS) and placed under the IEEE. The new organization will publish a roadmap of computer and communication systems, software, and so on. What will go into that report is still pretty vague.

http://www.eetimes.com/document.asp?doc_id=1329604

 
Alexey Busygin:
I asked about the processor, not the expansion card; they use different slots.

GPU = graphics processing unit (produced mainly by Nvidia)

CPU = central processing unit (made by Intel or AMD)

Both are processors. Don't you get it? Call the GPU a card or whatever you want, but it is a processor with 3000 cores if you have the latest model. If you have a computer, it also has a GPU; check the documentation to see which model you have and how many cores it has.
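If you want to check which GPU a machine has and roughly how many cores it exposes, a minimal CUDA sketch like the one below can query the device. Note that the runtime reports streaming multiprocessors (SMs); the familiar "CUDA core" figure is the SM count multiplied by a per-architecture factor, which this snippet deliberately does not hard-code.

    #include <cuda_runtime.h>
    #include <cstdio>

    int main()
    {
        int count = 0;
        cudaGetDeviceCount(&count);
        if (count == 0) {
            std::printf("No CUDA-capable GPU found\n");
            return 0;
        }

        for (int dev = 0; dev < count; ++dev) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, dev);
            // The runtime reports streaming multiprocessors (SMs);
            // total "CUDA cores" = SMs * cores per SM, which depends on the architecture.
            std::printf("Device %d: %s, compute capability %d.%d, %d multiprocessors\n",
                        dev, prop.name, prop.major, prop.minor, prop.multiProcessorCount);
        }
        return 0;
    }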
