Making a Python trading system for MT. - page 5

 

Yesterday I was suddenly fascinated by the SQLite database. There is a ready-made Python interface for it - https://docs.python.org/3.6/library/sqlite3.html.

As a result, I sat until late at night reading documentation on both the database itself and the Python interface. To clear up a few confusing points, I threw together a couple of completely useless SQLite test programs. In short, I'm fascinated; I shouldn't have neglected it before. The first thing I learned is that an SQLite database can be organized entirely in memory - with a single command: conn = sqlite3.connect(":memory:") in Python. That's all it takes. Then a few queries to create the tables, and you can work.
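
For example, a minimal sketch (the table and column names here are purely illustrative):

import sqlite3

# the whole database lives in RAM and disappears when the connection is closed
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# after that you work with it exactly as with a file database
cur.execute("create table quotes (ticker text, time text, price real, volume integer)")
cur.execute("insert into quotes values (?,?,?,?)", ("SPFB.SBRF-12.17", "10:01:00", 18838.0, 25))
conn.commit()
print(cur.execute("select * from quotes").fetchall())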

I used to use MS SQL Server or an Access database for the same purpose, and organized in-memory storage through the DataTable classes of the .NET libraries. Here it is one DLL about 1.5 MB in size and one command ))), and there is nothing more to do.

Now, what is this for? It isn't just convenient, it's necessary. While your system is busy calculating something, new data keeps arriving from the exchange - market data, deals, quotes - and you push it into the in-memory database. When the system needs fresh data, it is all right there, at hand, in memory: you pick it up and use it. No waiting for ticks and the like. Besides, when something happens, you can always break off the calculations and handle it - again, all the information is already there, in memory. Of course, you need user events and threads, but that is not a problem anywhere.
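
Very roughly, something like this sketch - the exchange feed is simulated here by a queue, and the listener and the "calculations" are only placeholders:

import queue
import sqlite3
import threading
import time

incoming = queue.Queue()                  # raw events from the exchange land here

def feed_listener():
    # stand-in for a real market-data callback or socket reader
    for i in range(100):
        incoming.put(("SPFB.SBRF-12.17", time.time(), 18838.0 + i, 1))
        time.sleep(0.01)

mem = sqlite3.connect(":memory:")
mem.execute("create table quotes (ticker text, ts real, price real, volume integer)")

threading.Thread(target=feed_listener, daemon=True).start()

for _ in range(20):                       # the system's main loop
    # drain everything that has arrived since the last pass
    while not incoming.empty():
        mem.execute("insert into quotes values (?,?,?,?)", incoming.get())
    mem.commit()

    # ... the system's own calculations go here; the fresh data is already at hand ...
    last = mem.execute("select * from quotes order by ts desc limit 1").fetchone()
    time.sleep(0.1)

The queue is just the simplest way to hand data from the listening thread to the thread that owns the SQLite connection; sqlite3 connections should not be shared between threads without extra care.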

Well, and then the information from the in-memory database is saved to the main database in the background.
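
One possible way to do such a dump - a sketch, with an assumed file name 'main.db'; Python 3.7+ also has Connection.backup() for copying a whole database, but a plain ATTACH works in any version:

# flush the accumulated rows from the in-memory database into the main on-disk one;
# assumes no transaction is currently open on mem (ATTACH cannot run inside one)
mem.execute("attach database 'main.db' as disk")
mem.execute("create table if not exists disk.quotes as select * from quotes where 0")  # same columns, empty
mem.execute("insert into disk.quotes select * from quotes")
mem.commit()
mem.execute("detach database disk")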

Oh, and SQLite, small as it is, supports multi-user read/write access. In general, almost everything is grown-up: there are some limitations, but in most cases they are either insignificant or do not get in the way at all.
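
For a file database the concurrency can be helped along a little - a sketch, and whether WAL mode is worth enabling depends on the workload:

import sqlite3

# wait up to 5 s instead of failing immediately if another connection holds the write lock
con = sqlite3.connect('csvdb.db', timeout=5.0)
# WAL mode: readers no longer block the writer and the writer no longer blocks readers
con.execute("pragma journal_mode=wal")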

 

Having started on the SQLite database, it wouldn't hurt to do something useful with it. Let's begin by importing a CSV file into the database.

Here is a fragment of the file itself; in total it has about 55 thousand lines.

<TICKER>,<PER>,<DATE>,<TIME>,<OPEN>,<HIGH>,<LOW>,<CLOSE>,<VOL>
SPFB.SBRF-12.17,1,01/09/17,10:01:00,18828.0000000,18838.0000000,18792.0000000,18838.0000000,25
SPFB.SBRF-12.17,1,01/09/17,10:02:00,18848.0000000,18858.0000000,18848.0000000,18858.0000000,2
SPFB.SBRF-12.17,1,01/09/17,10:03:00,18828.0000000,18828.0000000,18828.0000000,18828.0000000,2
SPFB.SBRF-12.17,1,01/09/17,10:04:00,18868.0000000,18868.0000000,18868.0000000,18868.0000000,2
SPFB.SBRF-12.17,1,01/09/17,10:05:00,18865.0000000,18879.0000000,18856.0000000,18879.0000000,10
SPFB.SBRF-12.17,1,01/09/17,10:06:00,18841.0000000,18851.0000000,18841.0000000,18851.0000000,3
SPFB.SBRF-12.17,1,01/09/17,10:07:00,18864.0000000,18864.0000000,18864.0000000,18864.0000000,1

Importing a CSV file into Python was described earlier, so we won't dwell on it here. As a result, we have the CSV header row and the data rows.

Now the Python program itself:

import sqlite3 as sql

# CSV header row and CSV data rows, produced by the CSV reader described earlier
title = csv.title
data = csv.InData

# connect to the database; if it does not exist, a new one with this name is created - our case
con = sql.connect('csvdb.db')
# create a cursor
cur = con.cursor()
# run the query that creates the SBER table and its fields (for simplicity all fields are text)
cur.execute("""create table SBER ('%s' text, '%s' text,
                                  '%s' text, '%s' text, '%s' text,
                                  '%s' text, '%s' text, '%s' text,
                                  '%s' text)""" % tuple(title))

# fill the table with data from the CSV rows
cur.executemany("insert into SBER values (?,?,?,?,?,?,?,?,?)", data)
# write the data to the database table
con.commit()
# if the database is no longer needed, close the connection
con.close()

We look at the SBER table in SQLiteStudio:

And the whole thing takes 3-4 s from program start, including opening and reading the CSV file. By the way, the database file is smaller than the corresponding CSV file: the whole database is 4.3 MB, while the CSV is 5.2 MB.

Well, the rest is done in the same way)).

 

Let's continue our topic, unpopular with the locals here.

Threads about reading data from CSV files appear on the forum regularly. In the last week alone there have been two - in one of them, the last (or next-to-last) couple of lines of the file needed to be read.

Last time we wrote data from a CSV file into the database; today we'll solve the problem of reading rows back from the database.

Our table in the database has about 55 thousand rows. Let's try to read not even the last one, but, say, row number 27568 (this version of the table also has an id column with the row number, which the simplified example above didn't). To do this, we write an SQL query and send it to the database:

import datetime as dt    # cur is a cursor on the csvdb.db connection, opened as before

# start timestamp
Tb = dt.datetime.today()
# prepare the SQL query and send it to the database
cur.execute('select * from SBER where id=27568')
# read the answer, a single row
s = cur.fetchone()
# end timestamp
Te = dt.datetime.today()
# print everything
print('Tb=' + str(Tb) + ', Te=' + str(Te))
print('Te-Tb=' + str(Te - Tb))
print(s)

That's it. We've got the result:

Tb=2018-10-05 21:53:11.185492, Te=2018-10-05 21:53:11.188496
Te-Tb=0:00:00.003004
(27568, 'SPFB.SBRF-12.17', '1', '27/10/17', '10:55:00', '19692.0', '19708.0', '19691.0', '19704.0', '1123')

Execution time ~0.003004 sec.

Let's make the task harder: find the rows where CLOSE = 22730.0000000.

Everything is the same, only the query looks different.

# this value is definitely present in row No. 55420
cur.execute('select * from SBER where CLOSE=22730.0000000')
# read all the rows returned by the query
s = cur.fetchall()

The result:

Tb=2018-10-05 22:03:09.797261, Te=2018-10-05 22:03:09.859763
Te-Tb=0:00:00.062502
[(38393, 'SPFB.SBRF-12.17', '1', '16/11/17', '18:16:00', '22708.0', '22734.0', '22707.0', '22730.0', '673'),
 (38932, 'SPFB.SBRF-12.17', '1', '17/11/17', '13:47:00', '22713.0', '22730.0', '22707.0', '22730.0', '194'),
 (38962, 'SPFB.SBRF-12.17', '1', '17/11/17', '14:22:00', '22725.0', '22736.0', '22725.0', '22730.0', '362'),
 ...
 (55288, 'SPFB.SBRF-12.17', '1', '15/12/17', '21:18:00', '22723.0', '22732.0', '22723.0', '22730.0', '87'),
 (55289, 'SPFB.SBRF-12.17', '1', '15/12/17', '21:19:00', '22727.0', '22730.0', '22726.0', '22730.0', '17'),
 (55382, 'SPFB.SBRF-12.17', '1', '15/12/17', '23:05:00', '22731.0', '22731.0', '22730.0', '22730.0', '322'),
 (55420, 'SPFB.SBRF-12.17', '1', '15/12/17', '23:48:00', '22735.0', '22741.0', '22727.0', '22730.0', '315')]

20 rows were returned. Execution time ~0.0625 s.
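
If searches by CLOSE were needed regularly, an ordinary index would spare SQLite from scanning all ~55 thousand rows every time - not measured here, just the standard recipe, with an arbitrary index name:

# one-time index on the CLOSE column
cur.execute("create index if not exists idx_sber_close on SBER (CLOSE)")
# the same search can now use an index lookup instead of a full table scan
cur.execute("select * from SBER where CLOSE=22730.0000000")
s = cur.fetchall()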

Finally, let's find the highest price in the CLOSE column (field):

cur.execute('select max(CLOSE) from SBER')

The response:

Tb=2018-10-05 22:11:28.979472, Te=2018-10-05 22:11:29.014472
Te-Tb=0:00:00.035000
('23540.0',)  # this is the highest price

And the execution time is ~0.035 s.
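
One caveat: since all the columns were created as text for simplicity, max() here compares strings. For this data that still gives the right answer (all the prices have the same number of digits), but a cast makes the comparison properly numeric:

# compare as numbers rather than as strings
cur.execute("select max(cast(CLOSE as real)) from SBER")
print(cur.fetchone())    # (23540.0,) as a float instead of a text value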

As you can see, everything is very simple and fast. If I were a consumer of CSV files, I would think: why not write a DLL with 5-6 functions for MT and use a database instead of CSV files?

PS The CSV file that all these database manipulations were done on is in the archive attached a few posts back.
 

Yeah, life is not easy for MQL4/5 developers. I googled whether there are solutions for connecting MT to SQLite. There is the MQL5.com article SQL AND MQL5: WORKING WITH SQLITE DATABASE straight from MQ, and some third-party materials. Imho, it is all overcomplicated, and the programs, unlike the similar ones in Python, are not simple.

In fact, to start working with SQLite it is enough to import just 7-8 high-level SQLite functions into MQL. The rest, for the most part, will only be needed, if at all, in the distant future. A C++ programmer could do this work in a couple of days at worst. But you can also try ready-made solutions.

PS I took a closer look at the article SQL AND MQL5: WORKING WITH SQLITE DATABASE. Actually, it might work. The only thing is that it uses an x86 DLL there; an x64 DLL needs to be prepared for MT. The article is from 2014, after all. But, if I'm not mistaken, an x64 SQLite DLL already exists, so you could try building with that.

 

Yuriy Asaulenko:

Yeah, life is not easy for MQL4/5 developers... Imho, it is all overcomplicated, and the programs, unlike the similar ones in Python, are not simple.

Imho, with Python the problems are no fewer: you need history, you need a bar chart, you need technical indicators... I would also think about a tester, and about trading itself...

Well, as the saying goes: every felt-tip pen tastes different!

 
Igor Makanu:

Imho, with Python the problems are no fewer: you need history, you need a bar chart, you need technical indicators... I would also think about a tester, and about trading itself...

Well, as the saying goes: every felt-tip pen tastes different!

History is not a problem, and neither is pulling it in. Technical indicators are not a problem either - besides, I only use my own indicators, and if the standard ones are needed, their algorithms are simple.

Python has tools for drawing bars. But do I really need that in an automated system? I am not writing a terminal.)

And this is not the first such system; there were earlier ones in VB/Excel, in C++, and in C#. Python was actually chosen for the breadth of its libraries and to avoid multiplying a zoo of different languages and interfaces just to get access to those libraries. Python also lets you do modeling - that is what it is made for, and it matters a great deal in system development. Let's say it is a direct competitor to MATLAB.)

 
Yuriy Asaulenko:

Question.

Is it possible to pass data between an MQL canvas and a Python canvas via a reference?

It's easy to do in C++, but I don't know about Python.

 
Nikolai Semko:

Question.

Is it possible to pass data between an MQL canvas and a Python canvas via a reference?

It's easy to do in C++, but I don't know about Python.

I don't know.

 

Good news ))) - Beta MetaTrader 5 build 1910: Free Drag & Drop of Charts and .Net Libraries in MQL5. Now, once the beta is working, you won't need to write anything to connect to SQLite (or many other libraries). DLLs will no longer be needed.

SQLite .NET libraries are already available - http://system.data.sqlite.org/index.html/doc/trunk/www/index.wiki

Now (or rather later, when a release version of MT appears) we can forget about MT communicating with the trading system via file exchange and communicate via a database instead. Earlier in this thread it was shown that this is very fast and more than enough for most systems.

 
Yuriy Asaulenko:

Good news ))) - Beta MetaTrader 5 build 1910: Free Drag & Drop of Charts and .Net Libraries in MQL5. Now, once the beta is working, you won't need to write anything to connect to SQLite (or many other libraries). DLLs will no longer be needed.

SQLite .NET libraries are already available - http://system.data.sqlite.org/index.html/doc/trunk/www/index.wiki

Now (or rather later, when a release version of MT appears) we can forget about MT communicating with the trading system via file exchange and communicate via a database instead. Earlier in this thread it was shown that this is very fast and more than enough for most systems.

You misunderstood the news.

Support for .NET libraries does not mean that security controls on DLLs are disabled. DLL controls have always worked and will always work.

We just let you load .NET without any tinkering. Or rather, we took all the messy and non-trivial work of communicating with the .NET virtual machine and hid all the inconveniences.


However, we have long been planning to add SQLite support to MQL5. Most likely we will.