Cloud setup for Python 2.7 and Python 3.7 scrapers

Closed Posted 4 years ago Paid on delivery

I need to have our scrapers deployed to the cloud.

I am scraping data from a few sites and storing it in a MySQL database. I already have all the scrapers and the database.

I have the following 6 scripts:

1. Scraper for ksl.com/cars - BeautifulSoup - Python 3.7

2. Scraper for [login to view URL] .com - BeautifulSoup - Python 3.7

3. Scraper for craigslist - Scrapy - Python 2.7

4. Scraper for autotrader - Scrapy - Python 2.7

5. Script for eBay Motors API - Python 3.7

6. [login to view URL] - Python 2.7 - calls stored procedures and sends their results via email using SendGrid

(I have already tried setting this up on Google Cloud Platform using Compute Engine and Cloud SQL, but it isn't working well.)

Our database is MySQL 5.7 2nd Gen InnoDB.

The setup needs to keep the five scraper scripts (1-5) constantly running (restarting automatically when they finish).

The scripts need to be dormant (not running) from 12:00 AM to 6:00 AM (GMT−7).
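The restart-on-finish and quiet-window requirements could both live in one small supervisor wrapper around each scraper, rather than in every script. A minimal sketch, assuming a fixed GMT−7 offset as stated in the posting (no DST handling) and a hypothetical scraper script name:

```python
import subprocess
import time
from datetime import datetime, timedelta, timezone

QUIET_TZ = timezone(timedelta(hours=-7))  # fixed GMT-7 offset, per requirements
QUIET_START, QUIET_END = 0, 6             # dormant 12:00 AM - 6:00 AM

def in_quiet_window(now=None):
    """Return True while the scrapers must stay dormant."""
    now = now or datetime.now(QUIET_TZ)
    return QUIET_START <= now.astimezone(QUIET_TZ).hour < QUIET_END

def supervise(cmd, poll_seconds=60):
    """Run `cmd` forever, restarting it each time it exits,
    but pausing during the quiet window."""
    while True:
        if in_quiet_window():
            time.sleep(poll_seconds)
            continue
        subprocess.run(cmd)  # blocks until the scraper finishes, then loops

# Usage (hypothetical entry point):
# supervise(["python", "ksl_scraper.py"])
```

Each container or VM unit could run one such supervisor per scraper, so "constantly running" and the dormant window are enforced in a single, testable place.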

I am open to using basically any cloud provider (Google, AWS, Azure, Digital Ocean, Heroku, etc.).

I am open to using docker containers.

The solution needs to support my running and testing it locally relatively easily.

The solution needs to support my pushing code changes to it relatively easily.
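One common way to satisfy both the local-testing and easy-deploy requirements is 12-factor-style configuration: every script reads its settings from environment variables, so the same code runs unchanged on a laptop, in a Docker container, or against Cloud SQL / RDS. A sketch with hypothetical variable names and local-development defaults:

```python
import os

def load_db_config():
    """Read MySQL connection settings from the environment,
    falling back to local-development defaults (names are illustrative)."""
    return {
        "host": os.environ.get("DB_HOST", "127.0.0.1"),
        "port": int(os.environ.get("DB_PORT", "3306")),
        "user": os.environ.get("DB_USER", "scraper"),
        "password": os.environ.get("DB_PASSWORD", ""),
        "database": os.environ.get("DB_NAME", "listings"),
    }

# Locally: run with no variables set and hit a local MySQL instance.
# In the cloud: set DB_HOST etc. to point at the managed database.
```

With this pattern, pushing a code change is just a `git push` plus a container rebuild or pull on the host; no per-environment code edits are needed.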

In your proposal, tell me:

- Exactly how you would architect the entire solution

- Why your solution makes sense / is the best

- How much experience you have with the technologies you'll use

- How many days it will take you to build it

If I award you the project, then I expect you to complete it in the timeframe you quote.

The setup will need to be well documented.

All of your code needs to be well commented.

Your code needs to have good error handling.

We will manage all code via GitHub.

At the end, you will need to walk me through the setup you build and show me how to use it.

Please feel free to ask any questions you need!

Python, Web Scraping, Software Architecture, MySQL, Cloud Computing

Project ID: #20782097

About the project

8 proposals Remote project Active 4 years ago

8 freelancers are bidding on average $57 for this job

joystick220

- Architecture: Compute: AWS Fargate to run your scrapers in a containerised environment, which lets you run them only during a certain time window. Database: AWS RDS. Documentation and resource orchestration: AWS Cloudf…

$20 USD in 7 days
(34 Reviews)
6.2
ferozstk

Hello. After reading your project details, I believe I'm suitable for this project, as I'm an expert in this area with more than 7 years of experience. Please feel free to contact me. I look forward to hearing from you. …

$30 USD in 1 day
(51 Reviews)
5.9
sepehrbg

Hi sir, I am experienced in scraping; I have done many similar jobs, which you can see in my reviews. I have already developed some code, so I can work quickly on your job. Would you please share the details?

$10 USD in 7 days
(34 Reviews)
5.2
bluelagon

Hi, let's get it done. I have similar experience.

$30 USD in 2 days
(22 Reviews)
4.5
frire

Hi, we can use AWS EC2 or any other VM provider like GCP. I suggest AWS EC2, since inbound costs are zero, so no extra charges will be incurred. I've deployed multiple apps on AWS, so I can say this can be done very…

$30 USD in 7 days
(6 Reviews)
4.4
MrMenezes

Would use GCloud + Kubernetes. Host one server for each script separately (ensuring isolation). Host a server to control and configure the schedule of each script (giving a single access point). Host a KubeDB with…

$300 USD in 15 days
(0 Reviews)
0.0