Interpolation, approximation and the like (the alglib package) - page 13

 
Алексей Тарабанов:

Well, that is exactly the point. Any interpolation polynomial is unsuitable for extrapolation. Fourier exactly reproduces the original series, while polynomials like Lagrange or Taylor produce curves whose rate of price change grows explosively. Smoothing softens the picture, but not by much, and it is wrong anyway: the connection to the original data is lost.

There is a simple, clear and effective extrapolation method that has nothing to do with interpolation: the trend.

You are taking a while to recover from the stress of reading that, just like the previous readers; a different topic is already being discussed here.

 
Maxim Dmitrievsky:

You are taking a while to recover from the stress of reading that, just like the previous readers; a different topic is already being discussed here.

Yes, this is off-topic here.

 

Hi Maxim,

A few days ago you were looking for kernel solutions for n input vectors instead of 2. Have you found that solution, or are you trying to implement it some other way?

If I am not wrong, then instead of K(x, y), where K is the kernel function, you are looking for the output of K(x1, x2, x3, ..., xn). Am I correct in understanding?

What I have learned is that the output of the kernel function is a scalar value. So it should be the sum of all the dot products, something like this:

K(x1, x2, x3, ..., xn) = Σ z(i) · z(i+1) for all i with 0 < i < n

It can be a for loop in MQL5 that sums the kernel function values.

I have no way to test it. But have you tried and tested something similar? Or am I missing something here in my understanding?
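For what it's worth, the summation described above can be sketched with a plain for loop (Python here for brevity; an MQL5 version would be analogous). The name `chain_kernel` is hypothetical, and this is the poster's proposed formula, not a standard kernel:

```python
import numpy as np

def chain_kernel(vectors):
    # Sum of dot products of consecutive inputs, z(i) . z(i+1),
    # as proposed above; the result is a single scalar.
    total = 0.0
    for i in range(len(vectors) - 1):
        total += float(np.dot(vectors[i], vectors[i + 1]))
    return total

# Three 2-D input vectors:
x1, x2, x3 = np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 1.0])
print(chain_kernel([x1, x2, x3]))  # x1.x2 + x2.x3 = 0 + 2 = 2.0
```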

 
FxTrader562:

Hi Maxim,

A few days ago you were looking for kernel solutions for n input vectors instead of 2. Have you found that solution, or are you trying to implement it some other way?

If I am not wrong, then instead of K(x, y), where K is the kernel function, you are looking for the output of K(x1, x2, x3, ..., xn). Am I correct in understanding?

What I have learned is that the output of the kernel function is a scalar value. So it should be the sum of all the dot products, something like this:

K(x1, x2, x3, ..., xn) = Σ z(i) · z(i+1) for all i with 0 < i < n

It can be a for loop in MQL5 that sums the kernel function values.

I have no way to test it. But have you tried and tested something similar? Or am I missing something here in my understanding?

Hi, I actually don't know how to do this now, because those algorithms (like SVM or Gaussian processes) work only with inner products, not with a feature mapping. I'm now looking for good ideas on how to do it better.

 
Maxim Dmitrievsky:

Hi, I actually don't know how to do this now, because those algorithms (like SVM or Gaussian processes) work only with inner products, not with a feature mapping. I'm now looking for good ideas on how to do it better.

As per my understanding, the kernel trick is part of the SVM algorithm, so do you mean you are no longer looking to implement the kernel trick?

What you call feature mapping is expressed in the kernel trick in terms of the dot (inner) product of the higher-dimensional space polynomials, so in my understanding it is just a simple multiplication inside the kernel function.

To make it clear: in K(x, y), are you planning to use the candle close prices of two consecutive candles as x and y to get the kernel, or are you trying to implement something else?
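As an aside, the usual polynomial kernel could in principle be evaluated on two scalar closes, although kernels are normally fed feature vectors rather than single prices. A minimal sketch with made-up numbers:

```python
import numpy as np

def polynomial_kernel(x, y, p=3):
    # Polynomial kernel of the common (1 + x.y)^p form;
    # np.dot of two scalars is just their product.
    return (1 + np.dot(x, y)) ** p

# Hypothetical inputs: two consecutive candle close prices as scalars
close_prev, close_curr = 1.1050, 1.1060
k = polynomial_kernel(close_prev, close_curr, p=2)
print(k)  # a single scalar, (1 + close_prev*close_curr)^2
```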

 
FxTrader562:

As per my understanding, the kernel trick is part of the SVM algorithm, so do you mean you are no longer looking to implement the kernel trick?

What you call feature mapping is expressed in the kernel trick in terms of the dot (inner) product of the higher-dimensional space polynomials, so in my understanding it is just a simple multiplication inside the kernel function.

To make it clear: in K(x, y), are you planning to use the candle close prices of two consecutive candles as x and y to get the kernel, or are you trying to implement something else?

I mean, I don't understand how to change the input vectors after the multiplication; they will be absolutely equal then. It says you need to use the Gram matrix to place the vectors (feature mapping), and then do some manipulations with it. Here is sample code with SVM:

https://pythonprogramming.net/soft-margin-kernel-cvxopt-svm-machine-learning-tutorial/

def polynomial_kernel(x, y, p=3):
    return (1 + np.dot(x, y)) ** p

# Gram matrix (excerpt from the tutorial's fit() method; X, y,
# n_samples and self.kernel come from the surrounding class)
K = np.zeros((n_samples, n_samples))
for i in range(n_samples):
    for j in range(n_samples):
        K[i, j] = self.kernel(X[i], X[j])
# ??? The Gram matrix K is symmetric. What must be done next? I don't understand the code below.

P = cvxopt.matrix(np.outer(y, y) * K)
q = cvxopt.matrix(np.ones(n_samples) * -1)
A = cvxopt.matrix(y, (1, n_samples))
b = cvxopt.matrix(0.0)

Right now I'm just learning about vector spaces to understand it.

Maybe it's better if we move to the English forum :)
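On the feature-mapping confusion: each Gram-matrix entry K[i, j] already *is* an inner product in the mapped space, so the transformed vectors never need to be formed explicitly. A minimal check for the degree-2 polynomial kernel on 2-D inputs (the explicit map `phi` is written out here only to demonstrate the equivalence; in practice the kernel trick exists precisely to skip it):

```python
import numpy as np

def poly2_kernel(x, y):
    # Degree-2 polynomial kernel, same (1 + x.y)^p form as above
    return (1 + np.dot(x, y)) ** 2

def phi(x):
    # Explicit feature map for 2-D input whose inner product
    # reproduces poly2_kernel(x, y)
    x1, x2 = x
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1 * x1, s * x1 * x2, x2 * x2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])
print(poly2_kernel(x, y))        # (1 + 1*3 + 2*0.5)^2 = 25.0
print(np.dot(phi(x), phi(y)))    # same value, up to float rounding
```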

 
Maxim Dmitrievsky:

I mean, I don't understand how to change the input vectors after the multiplication; they will be absolutely equal then. It says you need to use the Gram matrix to place the vectors (feature mapping), and then do some manipulations with it. Here is sample code with SVM:

https://pythonprogramming.net/soft-margin-kernel-cvxopt-svm-machine-learning-tutorial/

Right now I'm just learning about vector spaces to understand it.

Maybe it's better if we move to the English forum :)

Of course, the reference material is given in the other forums, where the Gram matrix is solved in the video. I am trying to understand it as well.

Also, have you already understood and implemented it in MQL5 so far? Otherwise, there is no point in trying further :)

 
FxTrader562:

Of course, the reference material is given in the other forums, where the Gram matrix is solved in the video. I am trying to understand it as well. Here is just another quick video reference, specific to the Gram matrix:

https://www.youtube.com/watch?v=8JiMUqbByGA

Also, have you already understood and implemented it in MQL5 so far? Otherwise, there is no point in trying further :)

It's a simple loop which calculates the Gram matrix.. but then a quadratic solver is run, and I'm not sure what for.. or maybe that already is the SVM logic :)

Thanks for the video.
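For what it's worth, the quadratic solver in that tutorial is solving the dual SVM problem, and the Gram matrix enters it only through the quadratic term P. A toy numpy sketch of how P is built (made-up numbers, cvxopt itself omitted):

```python
import numpy as np

# Toy Gram matrix for 3 samples, with class labels y in {+1, -1}
K = np.array([[2.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 1.0]])
y = np.array([1.0, -1.0, 1.0])

# P is the quadratic term of the SVM dual problem the QP solver handles:
#   minimize (1/2) a^T P a - 1^T a   subject to   y^T a = 0,  a >= 0
# P[i, j] = y[i] * y[j] * K[i, j], and it stays symmetric like K.
P = np.outer(y, y) * K
print(P)
```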

 
Maxim Dmitrievsky:

It's a simple loop which calculates the Gram matrix.. but then a quadratic solver is run, and I'm not sure what for.. or maybe that already is the SVM logic :)

Thanks for the video.

Exactly.. as I said, it can probably be implemented just with a for loop in MQL5.

Well, we don't need to bother about the other stuff as long as our end goal is achieved :)

I mean, as long as we can take the inputs in MQL5 and get the kernels as outputs as expected, the other stuff doesn't matter. The final part will be the testing, where it will be revealed whether everything has been implemented correctly, based on the results.

By the way, SVM is just a classification technique, and the kernel trick makes it easy thanks to the simple dot product. I don't think everything in SVM needs to be implemented for the kernel trick, because in the kernel trick everything is done by the function itself, so there is not much to do.

Also, this video explains SVM in detail along with sample code in Python using the kernel trick. You can have a look:

https://www.youtube.com/watch?v=N1vOgolbjSc&t=157s

 
FxTrader562:

But I don't understand how to work with the Gram matrix now, because it is not a new set of transformed features; it's just a matrix with the scalar products of the old features.
