Machine learning in trading: theory, models, practice and algo-trading - page 3381

 
Aleksey Nikolayev #:
Please share links (articles, books), if it's not too much trouble.

Forum on trading, automated trading systems and testing trading strategies

Useful literature.

Andrey Dik, 2010.07.24 22:26

Neural networks, genetic algorithms

A case study on using neural networks to perform technical.pdf
Christian L. Dunis Modelling and Trading EURUSD.pdf
ED PONSI Forex Patterns and Proabilities.pdf
Gorban A.N. Neuroinformatics. What are we, where are we going, how to measure our way.pdf
Haykin S. Kalman filtering and neural networks.djvu
Jonsson F. Markus. Finding the optimal path for vehicles on digitised maps of real terrain. Part 1.doc
Jonsson F. Markus. Finding the optimal path for vehicles on digitised maps of real terrain. Part 2.doc
Kondratenko V.V. Using Recurrent Neural Networks To Forecasting of Forex.pdf
Krose B. An introduction to Neural networks. 1996.djvu
Long, Parks Neural Network Modeling.djvu
Neural_Network_Trend_Predictor_Manual.pdf
Simon D. Training radial basis neural networks with the extended Kalman filter
ToshibaNeuronChip.pdf
Barskii A.B. Neural Networks Recognition, Control, Decision Making. 2004.pdf
Berkinblit M.B. Neural Networks 1993.djvu
Bastens D. Neural Networks and Financial Markets. Decision making in trading operations.djvu
Vapnik V.N. Restoration of dependencies on empirical data. 1997.djvu
Vezhnevets A. Non-Standard Neural Network Architectures.pdf
Voronovsky G.K. Genetic Algorithms, Artificial Neural Networks and Problems of Virtual Reality.pdf
Galushkin A.I. Theory of Neural Networks. Book 1 2000.djvu
Goldstein B.S. Intellectual Networks. 2000.djvu
Gorban A.N. Generalised approximation theorem and deduction possibilities of neural networks.pdf
Gorbunova E.O. Algorithmic universality of kinetic machine Kirdin.pdf
Gorbunova E.O. Methods of neuroinformatics. Finiteness and determinacy of simple programmes for the Kirdin kinetic machine.pdf
Jain Anil K. Introduction to artificial neural networks.pdf
Dorrer M.G. Intuitive prediction by neural networks of relations in a group.pdf
Dorrer M.G. Methods of neuroinformatics. Approximation of multidimensional functions by a semi-layer predictor with arbitrary converters.pdf
Dubrovich V.I. Subbotin S.A. Algorithm of accelerated learning of perseptrons.pdf
Ezhov A. Shumsky S. Neurocomputing and its applications in economy and business.djvu
Zhukov L.A. Use of neural network technologies for carrying out educational and research works.pdf
Zaentsev I.V. Neural networks basic models. 1999.pdf
Zakharov V.N. Khoroshevskiy V.F. Artificial Intelligence. Volume 3. Software and hardware 1990.djvu
Callan R. Basic concepts of neural networks.djvu
Kgur P.G. Neural networks and neurocomputers.pdf
Komashinsky V.I. Neural networks and their application in control and communication systems 2003.pdf
Committee NN.pdf
Korotkiy S. Hopfield and Hamming neural networks.pdf
Korotkiy S. Neural networks. Backpropagation Algorithm.pdf
Korotkiy S. Neural Networks. Learning without a teacher.pdf
Korotkiy S. Neural Networks. Fundamentals.pdf
Krisilov V.A. Kondartiuk A.V. Transformation of input data of neural networks in order to improve distinguishability.pdf
Krisilov V.A. Oleshko D.N. Methods of acceleration of training of neural networks.doc
Krisilov V.A. Chumichkin K.V. Acceleration of training of neural networks due to adaptive simplification of training sample.pdf
Krislov V.A. Representation of initial data in tasks of neural network forecasting.pdf
Kruglov V.V. Fuzzy Logic and Artificial Neural Networks.djvu
Kruglov, Borisov - Artificial Neural Networks. Theory and Practice, 2002.djvu
Kruglov, Borisov - Artificial Neural Networks. Theory and practice, 2002.txt
Liu B. Theory and practice of undefined programming 2005.djvu
McCulloch W., Pitts W. Logical calculus of ideas relating to neural activity.pdf
Markel J.D. Linear prediction of speech. 1980.djvu
Mirkes E.M. Neurocomputer. Draft standard. 1998.pdf
Modified Genetic Algorithm for Optimisation Problems in Control.pdf
Nabhan T.N. Zomaya A. On problems of creation of neural network structures for optimisation of functioning.pdf
Napalkov A.V., Pragina L.L. - Human Brain and Artificial Intelligence.docx
Oleshko D.N. Improving the Quality and Speed of Neural Networks Training in the Task of Predicting the Behaviour of Time Series.doc
Oleshko D.N. Improving the Quality and Speed of Neural Networks Training.doc
Ostrovsky S. Neural Networks for Information Processing 2000.djvu
Pavlidis T. Algorithms of Machine Graphics and Image Processing.djvu
Penrose R. The New Mind of the King. On computers, thinking, and the laws of physics. 2003.djvu
Pitenko A.A. Use of neural network technologies in solving analytical tasks in GIS.pdf
Rutkovskaya D. Neural Networks Genetic Algorithms and Fuzzy Systems.djvu
Senashova M.Yu. Neural Networks Errors. Calculation of errors of synapse weights. 1998.pdf
Subbotin S.A. Neurocybernetics in USSR-CIS - Analytical review of inventions and patents.pdf
Tarasenko R.A. Choice of size of situation description at formation of training sample for neural networks in tasks of forecasting of time series.doc
Tarasenko R.A. Preliminary estimation of quality of training sample for neural networks in tasks of forecasting of time series.doc
Terekhov S.A. Technological aspects of training of neural network machines. 2006.pdf
feasibility-INVEST.gif
Tyumentsev Y.V. Intelligent Autonomous Systems - Challenge to Information Technologies.pdf
Wosserman F. Neurocomputer Engineering.doc
Wosserman F. Neurocomputer Engineering. Theory and practice.doc
Haykin S. Neural networks - full course.djvu
Tsaregorodtsev V.G. Production of semi-empirical knowledge from data tables by means of trained artificial neural networks.pdf


 
Maxim Dmitrievsky #:
The FF is the same, isn't it?

When searching for a hundred sets, yes. If you find a hundred sets through FF1, a poor average set does not mean there is no good set, because a good average set may well be found through FF2.

 
Andrey Dik #:
The components can be evaluated separately, in a multi-criteria space, or all together via meta-evaluations, otherwise known as integral evaluations.

Links to works on this topic would be interesting.
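
A minimal sketch of the distinction described above: keeping the criteria separate (Pareto dominance) versus folding them into one "integral" score via a weighted sum. The criteria (profit, drawdown) and the weights are illustrative assumptions only, not anything proposed in the thread.

# Separate vs integral evaluation of multiple criteria.
candidates = {
    "set_A": {"profit": 120.0, "drawdown": 15.0},
    "set_B": {"profit": 100.0, "drawdown": 8.0},
    "set_C": {"profit": 90.0,  "drawdown": 20.0},
}

def dominates(a, b):
    """Separate evaluation: a dominates b if it is no worse on every criterion
    and strictly better on at least one (higher profit, lower drawdown)."""
    return (a["profit"] >= b["profit"] and a["drawdown"] <= b["drawdown"]
            and (a["profit"] > b["profit"] or a["drawdown"] < b["drawdown"]))

def integral_score(c, w_profit=1.0, w_drawdown=2.0):
    """Integral (meta) evaluation: collapse both criteria into a single number."""
    return w_profit * c["profit"] - w_drawdown * c["drawdown"]

# Pareto front: candidates not dominated by any other candidate.
front = [name for name, c in candidates.items()
         if not any(dominates(o, c) for other, o in candidates.items() if other != name)]
print("Pareto-optimal sets:", front)

# Single ranking under the integral evaluation.
ranked = sorted(candidates, key=lambda n: integral_score(candidates[n]), reverse=True)
print("Ranking by integral score:", ranked)

The weighted sum is the simplest form of integral evaluation; the Pareto check keeps the criteria separate and only rules out candidates that are worse on every count.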

 
fxsaber #:

When searching for a hundred sets, yes. If you find a hundred sets through FF1, a poor average set does not mean there is no good set, because a good average set may well be found through FF2.

Well, that's right.
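
A rough sketch of that argument, with invented candidate sets and two invented fitness functions FF1 and FF2 (all names and numbers are assumptions for illustration): ranking the same pool under the two FFs gives different top lists, so a set that looks mediocre under FF1 can still surface under FF2.

import random

random.seed(1)

# Hypothetical parameter sets, each with two observable qualities.
candidates = [{"id": i,
               "mean_profit": random.uniform(-10, 30),
               "stability": random.uniform(0, 1)} for i in range(100)]

def ff1(c):
    # FF1 cares only about mean profit.
    return c["mean_profit"]

def ff2(c):
    # FF2 rewards stable sets, even at lower profit.
    return c["mean_profit"] * c["stability"]

# Top-10 candidate sets under each fitness function.
top_ff1 = {c["id"] for c in sorted(candidates, key=ff1, reverse=True)[:10]}
top_ff2 = {c["id"] for c in sorted(candidates, key=ff2, reverse=True)[:10]}

print("Top-10 under FF1:", sorted(top_ff1))
print("Top-10 under FF2:", sorted(top_ff2))
print("Found only by FF2:", sorted(top_ff2 - top_ff1))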
 
mytarmailS #:

1)

What is the contradiction?

parameter selection == parameter search in the optimisation algorithm

model metric evaluation == an FF with, e.g., an accuracy estimate

What don't you agree with?

Read this, especially the section "Loss function != quality metric". I don't think I could make it any clearer.

mytarmailS #:

2)

Can you elaborate on what you see as the problem? I, for one, don't see one.

It leads to a potentially unbounded number of parameters, since function spaces are infinite-dimensional. In practice this means the number of parameters has to be controlled somehow - for trees, for example, that is leaf pruning (see the sketch after the link below).
Метрики классификации и регрессии [Classification and regression metrics]
  • education.yandex.ru
How to evaluate model quality for classification or regression, and why different tasks need different metrics
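
To make the "loss function != quality metric" point concrete, here is a minimal sketch (an illustrative example, not code from the linked article): the tree is fitted by optimising its internal splitting criterion, while accuracy and log-loss are only computed afterwards as quality metrics, and the ccp_alpha pruning parameter is what keeps the number of leaves, i.e. the effective number of parameters, bounded.

# The fitting criterion is not the reported quality metric, and pruning (ccp_alpha)
# bounds the number of leaves. Dataset and parameter values are arbitrary
# assumptions for illustration.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, log_loss
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for alpha in (0.0, 0.005, 0.02):  # stronger pruning -> fewer leaves ("parameters")
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, tree.predict(X_te))    # quality metric 1
    ll = log_loss(y_te, tree.predict_proba(X_te))     # quality metric 2
    print(f"ccp_alpha={alpha}: leaves={tree.get_n_leaves()}, "
          f"accuracy={acc:.3f}, log_loss={ll:.3f}")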
 
Valeriy Yastremskiy #:

Links to works on this topic would be interesting.

I can't give a specific reference, unfortunately. I posted a list of literature above; whoever is interested will have to search through it themselves. I will no longer engage in such educational activities - maintaining a library of books and cataloguing them is not appreciated and does not bring money.

"Don't cast pearls before swine..."

 
Andrey Dik #:

I can't give a specific reference, unfortunately. I posted a list of literature above; whoever is interested will have to search through it themselves. I will no longer engage in such educational activities - maintaining a library of books and cataloguing them is not appreciated and does not bring money.

"Don't cast pearls before swine..."

I would like a link to specific algorithms for multi-criteria optimisation in function space. But if you are not ready to provide one, it is better to just say so, for clarity). I am not prepared to waste time searching for it myself.
 
Valeriy Yastremskiy #:

Links to works on this topic would be interesting.

There is a certain slyness to this. Links get requested just to check that they open; no one who is "interested" will actually delve into them. Nobody reads even Andrey's pre-digested articles, let alone academic works.


Has anyone seen that kind of easy-to-understand TOP, where you can calculate the ranking of your own optimisation algorithm?

Forum on trading, automated trading systems and testing trading strategies

Machine Learning in Trading: Theory, Models, Practice and Algorithm Trading

mytarmailS, 2024.01.11 10:29 AM

And no one seems bothered that a top-3, or even top-1, universally recognised and well-known optimisation algorithm such as PSO ends up at the bottom of his rating, while no-name algorithms nobody has ever heard of, such as grey wolves, weeds, etc., are his leaders))))

https://habr.com/ru/users/belyalova/publications/articles/

Статьи / Профиль belyalova [Articles / belyalova's profile]
  • 2021.10.12
  • habr.com
Hello again, Habr! My name is Maria Belyalova, and I do data science at the Prequel mobile photo editor; the photo in the post header was, in fact, processed in it. This is the second article in our series comparing optimisation algorithms for training neural networks. In the first part we compared the behaviour of 39 algorithms on test...
 
Andrey Dik #:
maintaining a library of books and cataloguing them

Not every useful thing brings money)))))
