Cloud setup for Python 2.7 and Python 3.7 scrapers
$10-30 USD
Paid on delivery
I need to have our scrapers deployed to the cloud.
I am scraping data from a few sites and storing it in a MySQL database. I already have all the scrapers and the database.
I have the following 6 scripts:
1. Scraper for ksl.com/cars - BeautifulSoup - Python 3.7
2. Scraper for [login to view URL] .com - BeautifulSoup - Python 3.7
3. Scraper for craigslist - Scrapy - Python 2.7
4. Scraper for autotrader - Scrapy - Python 2.7
5. Script for eBay Motors API - Python 3.7
6. [login to view URL] - Python 2.7 - calls stored procedures and sends their results via email using SendGrid
(I have already tried setting things up on Google Cloud Platform using Google Compute Engine and Google Cloud SQL, but it isn't working very well.)
Our database is MySQL 5.7 2nd Gen InnoDB.
The setup needs to have all 5 scraping scripts constantly running (each one restarting automatically when it finishes).
The scripts need to be dormant (not running) from 12:00 AM to 6:00 AM (GMT-7).
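One way to satisfy the "run constantly, but sleep overnight" requirement is a small supervisor loop around each script. A minimal Python sketch, assuming a fixed GMT-7 offset (no daylight-saving handling) and a hypothetical script name:

```python
import datetime as dt
import subprocess
import time

# Dormant window: 12:00 AM - 6:00 AM at a fixed GMT-7 offset (assumption:
# the posting's "GMT - 7:00" is a fixed offset, not a DST-aware zone).
QUIET_TZ = dt.timezone(dt.timedelta(hours=-7))
QUIET_START_HOUR = 0   # 12:00 AM
QUIET_END_HOUR = 6     # 6:00 AM (exclusive)

def in_quiet_window(now=None):
    """Return True while the GMT-7 clock is inside the dormant window."""
    now = now or dt.datetime.now(QUIET_TZ)
    return QUIET_START_HOUR <= now.hour < QUIET_END_HOUR

def run_forever(cmd):
    """Restart `cmd` whenever it exits; stay idle during the quiet window."""
    while True:
        if in_quiet_window():
            time.sleep(60)       # re-check once a minute
            continue
        subprocess.call(cmd)     # blocks until the script finishes, then loops

# Usage (hypothetical script name):
#   run_forever(["python3", "ksl_scraper.py"])
```

The same window check would translate directly into a scheduler rule (e.g. cron or a container orchestrator's scheduled tasks) if you prefer not to keep a wrapper process running.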
I am open to using basically any cloud provider (Google, AWS, Azure, DigitalOcean, Heroku, etc.).
I am open to using Docker containers.
The solution needs to support my running and testing it locally relatively easily.
The solution needs to support my pushing code changes to it relatively easily.
In your proposal, tell me:
- Exactly how you would architect the entire solution
- Why your solution makes sense / is the best
- How much experience you have with the technologies you'll use
- How many days it will take you to build it
If I award you the project, then I expect you to complete it in the timeframe you quote.
The setup will need to be well documented.
All of your code needs to be well commented.
Your code needs to have good error handling.
We will manage all code via GitHub.
At the end, you will need to walk me through the setup you build and show me how to use it.
Please feel free to ask any questions you have!
Project ID: #20782097
About the project
8 freelancers are bidding on average $57 for this job
- Architecture. Compute: AWS Fargate to run your scrapers in a containerised environment, which gives you the option to run them only during certain time windows. Database: AWS RDS. Documentation and resource orchestration: AWS Cloudf…
- Hi sir, I am experienced in scraping. I have done many similar jobs, which you can see in my reviews. I have already developed some code, so I can move quickly on your job. Would you please share the details?