# TradeAlert Backend

### Introduction

This backend handles all of the store data that the TradeAlert frontend relies on: finding product URLs, gathering the product details behind those URLs, and combining or correcting product IDs when a match is found.

### Requirements

To run the backend, the following are required:

* Python 3 (Python 2 may work, but the backend has only been tested with Python 3)
* A Linux or Windows server with a network connection
* python3-pip (to install the third-party modules listed below)

The Python modules used across this project are as follows (a quick import check is shown after the list):

* BeautifulSoup4
* urllib2
* re (regex)
* MySQLdb
* sys
* SolrPy
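
As an optional sanity check, the sketch below attempts to import each of the modules above. The import names used here (for example `bs4` for BeautifulSoup4 and `solr` for SolrPy) are common defaults and are assumptions; the urllib2 fallback for Python 3 is likewise an assumption, not something the scripts are guaranteed to use.

```python
# Optional sanity check: confirm the modules listed above are importable.
# Import names are assumptions (e.g. bs4 for BeautifulSoup4, solr for SolrPy).
import re    # standard library (regex)
import sys   # standard library

import MySQLdb                 # MariaDB/MySQL client
from bs4 import BeautifulSoup  # BeautifulSoup4
import solr                    # SolrPy

try:
    import urllib2             # Python 2 module listed above
except ImportError:
    import urllib.request as urllib2  # rough Python 3 equivalent

print("All required modules imported successfully")
```

If any of these imports fail, the corresponding package can be installed with python3-pip.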

### Installation

Once the required modules above have been installed, all of the Python (.py) files must be placed in the same directory, or kept in their existing nested directories where applicable. The file config.py sets the database connection details for both MariaDB/MySQL and the Solr instance. Once these details have been set, the scripts can either be run directly from the command line or scheduled as cron jobs.
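
As an illustration only, config.py might look something like the sketch below. Every variable name and value here is a placeholder; the actual names expected by the scripts may differ.

```python
# config.py -- illustrative sketch only; variable names and values are
# placeholders and may not match what the scripts actually expect.

# MariaDB / MySQL connection details
DB_HOST = "localhost"
DB_USER = "tradealert"
DB_PASSWORD = "change-me"
DB_NAME = "tradealert"

# Solr instance details
SOLR_URL = "http://localhost:8983/solr/tradealert"
```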