Machine learning in trading: theory, models, practice and algo-trading - page 3621

[Deleted]  
mytarmailS #:

1) Try different ML models (e.g. in my competition tree-based models didn't work at all, only function-approximation models worked: SVM, neural nets, linear models).

2) Don't feed all the features in one pile; look for the best subset of features. In practice a subset of 3-7 is enough.


You can get 0.8 on 1000 features and 0.1 on 5 features.
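A minimal sketch of what such a small-subset search could look like (the toy data, the 3-feature subset size and the SVM scorer are my own placeholder assumptions, not anything from this thread):

```python
# Exhaustively score every 3-feature combination with cross-validation
# and keep the best one. Data, subset size and model are illustrative.
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))           # toy data: 500 samples, 20 features
y = (X[:, 3] + X[:, 7] > 0).astype(int)  # target depends on only 2 of them

best_score, best_subset = -np.inf, None
for subset in combinations(range(X.shape[1]), 3):   # every 3-feature subset
    score = cross_val_score(SVC(), X[:, list(subset)], y, cv=5).mean()
    if score > best_score:
        best_score, best_subset = score, subset

print(best_subset, round(best_score, 3))
```

For more than a couple of dozen features an exhaustive search explodes combinatorially, so greedy or randomized subset search is the usual substitute.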

I selected the features. On different amounts of data it selects different ones, plus the importance drifts across different combinations. So it's all rubbish :)

I did recursive selection too, it doesn't find anything either.
The error doesn't drop on validation. As if there are no dependencies in the dataset. But there should be :)

This is exactly the situation of despair where people start torturing causal inference.
 
Maxim Dmitrievsky #:
I selected the features. On different amounts of data it selects different ones, plus the importance drifts across different combinations. So it's all rubbish :)

I did recursive selection too, it doesn't find anything either.
The error doesn't drop on validation.

Try bootstrap
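One possible reading of "try bootstrap", sketched below: refit the model on bootstrap resamples and keep only the features whose importance stays stable across them. The random forest, the toy data and the stability check are my own assumptions for illustration.

```python
# Refit on bootstrap resamples and look at how stable each feature's
# importance is across them. Model and data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] - X[:, 4] > 0).astype(int)

n_boot = 30
importances = np.zeros((n_boot, X.shape[1]))
for b in range(n_boot):
    idx = rng.integers(0, len(X), size=len(X))   # sample rows with replacement
    model = RandomForestClassifier(n_estimators=50, random_state=b)
    model.fit(X[idx], y[idx])
    importances[b] = model.feature_importances_

# Features whose importance is consistently high across resamples
mean, std = importances.mean(axis=0), importances.std(axis=0)
for i in np.argsort(mean)[::-1]:
    print(f"feature {i}: {mean[i]:.3f} ± {std[i]:.3f}")
```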

[Deleted]  
mytarmailS #:

Try bootstrap.

That way I might pass validation by accident, but I still won't pass the test.
 
Maxim Dmitrievsky #:
I selected the features.

Try extracting rules from tree-based models and picking the best ones, like I did a long time ago, remember?

[Deleted]  
mytarmailS #:

Try extracting rules from tree-based models and picking the best ones, like I did a long time ago, remember?

I remember something like that. Let's see, closer to the weekend I'll be at it for 24 hours again and check a lot of things. Maybe I'll have an epiphany.
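For reference, a minimal sketch of the "rules from tree-based models" idea: fit a shallow decision tree and dump its paths as human-readable rules. The toy data and the depth limit are assumptions; sklearn's export_text does the printing.

```python
# Fit a shallow decision tree and print its decision rules as text.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = ((X[:, 1] > 0.5) & (X[:, 3] < 0)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=[f"f{i}" for i in range(X.shape[1])]))
```

Each root-to-leaf path is then a candidate rule that can be evaluated and kept or discarded on its own.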
[Deleted]  

It makes no sense to pull such features out of thin air, because there are specialised algorithms:


1. Frequent Subgraph Mining:

These algorithms look for subgraphs that occur frequently in a set of graphs. Popular algorithms include:

- gSpan

- FSG (Frequent Subgraph Discovery)

- FFSM (Fast Frequent Subgraph Mining).


2. Graph Similarity Search:

These methods search a set for graphs that are similar to each other. Various graph similarity measures are used, such as:

- Graph edit distance (a NetworkX sketch follows this list)

- Maximum common subgraph isomorphism

- Graph kernel methods


3. Anomaly Detection in graphs:

These algorithms look for unusual or anomalous structures in a set of graphs:

- Density-based algorithms

- Random-walk-based methods

- Spectral methods


4. Classification and clustering of graphs:

These methods group similar graphs or classify them into given categories:

- Graph kernels

- Graph neural networks

- Spectral clustering of graphs


5. Motif detection in graphs:

These algorithms look for recurring structural patterns (motifs) in graphs:

- FANMOD

- NeMoFinder

- MODA


6. Analysis of graph evolution:

These methods study how graphs change over time:

- Algorithms for detecting changes in dynamic graphs

- Predicting graph evolution
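Some of these are available directly in NetworkX. As an illustration of item 2 above, a minimal graph edit distance sketch (the two toy graphs are my own example):

```python
# Graph similarity via graph edit distance, using NetworkX.
import networkx as nx

G1 = nx.cycle_graph(4)   # 4-node cycle
G2 = nx.path_graph(4)    # 4-node path (one edge fewer)

# Minimum number of node/edge insertions, deletions and substitutions
# needed to turn G1 into G2 (exact search, so only viable for small graphs).
print(nx.graph_edit_distance(G1, G2))  # -> 1.0
```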

 
Are these real libraries, or are they a chatbot's hallucinations?
[Deleted]  
mytarmailS #:
Are these real libraries, or are they a chatbot's hallucinations?

They're real, I've used them.

Prado uses them in his examples. I don't know about the others.

Software for Complex Networks
  • networkx.org
NetworkX is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks. With NetworkX you can load and store networks in standard and nonstandard data formats, generate many types of random and classic networks, analyze network...
[Deleted]  

These are the graphs in his dataset.

The only difference is that they have two nodes, X and Y, plus intermediate nodes.

DAG - Topological Layout
  • networkx.org
This example shows how to visualize a DAG in topologically-sorted order.
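A minimal sketch of that X → Y DAG idea with NetworkX (node names and edges are illustrative assumptions):

```python
# Build a small directed acyclic graph and list its nodes in topological order.
import networkx as nx

dag = nx.DiGraph([("X", "A"), ("X", "B"), ("A", "Y"), ("B", "Y")])

print(nx.is_directed_acyclic_graph(dag))   # True
print(list(nx.topological_sort(dag)))      # e.g. ['X', 'A', 'B', 'Y']
```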
[Deleted]  
There are also graph neural networks, often based on convolutional layers. I was training one like that just yesterday.
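For the curious, a from-scratch sketch of a single graph-convolution layer, just to show what "convolutional layers on a graph" means; the sizes and random weights are assumptions, and real work would use a library such as PyTorch Geometric or DGL.

```python
# One graph-convolution layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 0, 0],      # adjacency matrix of a 4-node graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 8))      # 8 input features per node
W = rng.normal(size=(8, 4))      # layer weights: 8 -> 4 features

A_hat = A + np.eye(len(A))                          # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))   # symmetric normalisation
H_next = np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)  # ReLU

print(H_next.shape)  # (4, 4): a new 4-dim embedding for each node
```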