Log shipping is the process of automating the backup of a database and its transaction log files on a primary (production) database server, and then restoring them onto a standby server.
Log shipping is a technique for replicating a database to another database instance by copying and reloading transaction log files.
In the case of EPPO, the transaction log files are transferred through web services.
As a result of the process, there are two copies of the data in two separate locations.
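Conceptually, each shipping cycle asks the web service for the next chunk of transaction log and replays it on the replica. The sketch below only illustrates that idea: the endpoint path, parameter name and response format are assumptions, not the actual EPPO API; the client scripts described below are the real implementation.

<?php
// Illustrative sketch only: the endpoint and response format are hypothetical.
$token = 'xxxxxxxxxxxxxxxxxxxxxx'; // AUTHTOKEN created on data.eppo.int
// Fetch the next batch of transaction log statements from the web service.
$sql = file_get_contents('https://data.eppo.int/api/logshipping?authtoken=' . $token);
// Replay the statements on the local replica database.
$pdo = new PDO('pgsql:dbname=DBEPPOreplica;user=dbuser;password=yourpasswordhere;host=127.0.0.1');
$pdo->exec($sql);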
A log shipping session involves the following steps:
Initialization for PostgreSQL Database
# createdb -h 127.0.0.1 -E UTF8 -U postgres DBEPPOreplica
# psql -h 127.0.0.1 -U postgres --single-transaction -q DBEPPOreplica < pg_structure.sql
# psql -h 127.0.0.1 -U postgres --single-transaction -q DBEPPOreplica < init_logshipping.sql
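To check that the structure loaded correctly, you can optionally list the tables of the new database:

# psql -h 127.0.0.1 -U postgres -c "\dt" DBEPPOreplica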
Initialization for MySQL Database
# mysqladmin --user=root --password=mypassword create dbepporeplica
# mysql dbepporeplica --user=root --password=mypassword --batch --default-character-set=UTF8 < mysql_structure.sql
# mysql dbepporeplica --user=root --password=mypassword --batch --default-character-set=UTF8 < init_logshipping.sql
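To check that the structure loaded correctly, you can optionally list the tables:

# mysql dbepporeplica --user=root --password=mypassword --batch -e "SHOW TABLES;"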
Configure the script
In the script you have to change two parameters:
define('DB_DSN_PG','pgsql:dbname=DBEPPOreplica;user=dbuser;password=yourpasswordhere;host=127.0.0.1');
or
define('DB_DSN_MY','mysql:dbname=dbepporeplica;username=dbuser;password=yourpasswordhere;host=127.0.0.1;charset=UTF8');
and
define('AUTHTOKEN', 'xxxxxxxxxxxxxxxxxxxxxx');
The token is required and can be created online at https://data.eppo.int.
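Before running the client, you can verify the DSN with a short connection test (a minimal sketch, assuming the same DSN string you configured above; for a MySQL replica adapt the DSN accordingly):

<?php
// Optional connectivity check (sketch): paste the DSN you configured above.
$dsn = 'pgsql:dbname=DBEPPOreplica;user=dbuser;password=yourpasswordhere;host=127.0.0.1';
try {
    new PDO($dsn);
    echo "Connection OK\n";
} catch (PDOException $e) {
    echo 'Connection failed: ' . $e->getMessage() . "\n";
}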
Execution of the script
For MySQL
# php client_mysql_v1.php
For PostgreSQL
# php client_pg_v1.php
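Log shipping is meant to run regularly, so the client is typically scheduled with cron. The crontab entry below is an example only; the hourly schedule and the script location are assumptions to adapt to your installation.

0 * * * * php /path/to/client_pg_v1.php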