
Need Experienced Hadoop, Kafka, Spark Consultant & Architect

min $50 USD / hour

Closed
Posted over 5 years ago


Hi, we need a Hadoop, Kafka, and Spark consultant with extensive real-world production experience deploying such systems on DC/OS. The job will be advising our engineers on industry best practices for developing and deploying these systems. The job includes:
- Architecture design/consultation
- Use-case scenarios/best practices
- Design/algorithm review
- Code review
- Pair programming with our engineers
We are based in the GMT+7 timezone, so a regular meeting/coaching schedule is necessary. We expect to bring on 2-5 different consultants for a minimum of 10 hours per week.
Project ID: 17572881

About the project

21 proposals
Remote project
Active 6 yrs ago

21 freelancers are bidding on average $53 USD/hour for this job
Hello, I have been working with Big Data/Hadoop technologies for years. I have worked with Hadoop, MapReduce, Kafka, Spark, Cassandra, Hive/HBase, and the ELK stack using Java, Scala, and Python. Can we discuss the details further? Thank you!
$55 USD in 40 days
4.9 (49 reviews)
5.5
Hi, I am a data engineer with 4+ years of industry experience. I have been working on data analytics, big data, and data warehousing solutions both on-premises and in the cloud. I have expertise in the following areas:
1. Spark pipelines for data lake solutions in AWS with EMR and Data Pipeline
2. Data validation, profiling, and processing with Spark (see the sketch after this list)
3. Spark with NiFi and Kafka for Twitter streaming applications
4. Ad-hoc reporting solutions using AWS EMR and Hive
5. Hortonworks Data Platform and Hortonworks DataFlow distributions
6. Teradata Kylo with Spark, NiFi, Kafka, and Hive for data lakes
7. Teradata, IBM DB2, and AWS Redshift
8. Python, Java, and Scala
9. Unit testing with Python (unittest) and Scala (scalatic)
10. Scala Build Tool (SBT) and Maven for dependency management
11. DevOps, Jenkins, and Docker
I can do this job according to your expectations and can allocate 15 hours/week to you. Looking forward to hearing from you. Best regards
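As a rough illustration of the Spark data-validation and profiling work in point 2, here is a minimal Scala sketch; the input path, app name, and the choice of a null-count report are assumptions made up for this example, not the bidder's actual code.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, count, when}

    object ProfileDataset {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("dataset-profile")
          .getOrCreate()

        // Hypothetical input on a data lake, e.g. read from S3 by an EMR cluster.
        val df = spark.read.parquet("s3://example-bucket/raw/events/")

        // Quick profile: total row count plus a null count for every column.
        val nullCounts = df.columns.map(c => count(when(col(c).isNull, c)).alias(s"${c}_nulls"))
        println(s"rows = ${df.count()}")
        df.select(nullCounts: _*).show(truncate = false)

        spark.stop()
      }
    }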
$50 USD in 40 days
4.9 (23 reviews)
5.1
Hi, I have 7 years of experience working on Hadoop, Spark, NoSQL, Java, cloud, and more. I have done end-to-end data warehouse management projects on the AWS cloud with Hadoop, Hive, Spark, and PrestoDB, and have worked on multiple ETL projects with Kafka, NiFi, Flume, MapReduce, Spark with XML/JSON, Cassandra, MongoDB, HBase, Redis, Oracle, SAP HANA, ASE, and many more. Let's discuss the requirements in detail. I am committed to getting the work done and am strong at issue resolution as well. Thanks
$50 USD in 40 days
5.0 (8 reviews)
4.1
IT offshore software application development company with experts from India and the United States, offering a broad range of skills from our in-house staff: dedicated Android developers, iOS developers, Laravel developers, WordPress developers, Ruby on Rails developers, website developers, e-commerce developers, PHP developers, ASP.NET developers, ASP programmers, web designers, VB/VB.NET developers, admin staff, and more. We always deliver solutions well within the timeline stipulated by our clients. Many businesses across the globe have already benefited greatly from our services.
$55 USD in 40 days
5.0 (1 review)
1.6
Hi, I am a software professional with around 12 years of experience in software development and test automation. I specialise in the skills below:
- Excel macro development
- Development of software test automation frameworks and test scripts
- Creation of robotic process automation bots
- Manual and automation testing
Specialised in these tools:
- Java, Scala
- Kafka, GoldenGate, JSON
- Excel macros/VBA
- HP UFT/QTP
- TestComplete
- Selenium, Cucumber
- Automation Anywhere, WorkFusion
Domains: healthcare, investment banking, retail, forex.
Methodologies: Agile, iterative, waterfall.
I have worked with clients like the NHS, UnitedHealth, BT, O2, Thales, and Citigroup over my career.
$50 USD in 40 days
5.0 (1 review)
1.6
Good day. The job responsibility is fine, but what made you so confident about this tech stack? Have you really brainstormed enough? I doubt it. I have contributed 17 years to the industry and work as a digital architect for one of the top companies; I take some outside assignments to meet my needs. I believe you need to rethink what you are trying to achieve. Let's connect and I can help you. Regards, Neo
$55 USD in 10 days
0.0 (1 review)
0.0
Hello, I would appreciate your valuable time if we could arrange an audio/video call to discuss the technical aspects of the project posted here. We have extensively experienced developers to handle complex projects, and I have a camera set up so you can monitor the development work. Can we start a technical discussion? My Skype ID: aman1304. BR,
$55 USD in 40 days
0.0 (0 reviews)
0.0
Hello, we are a group of freelancers working extensively on big data technologies and the Hadoop ecosystem. We have more than 7 years of experience in this tech stack and have worked on Apache Spark, Kafka, Hive, Hadoop, HBase, MapReduce, Impala, Hortonworks, Cloudera, AWS services, and DevOps engineering. We are new to this platform, but we have worked on projects with the same tech stack.
We are currently working for an enterprise client, pulling data from different payload assets that flows from an IoT hub into Apache Spark; we process the data in Spark and then write it out to Cassandra, Redis, in-memory stores, and Blob storage.
We are also working on a data warehousing project for a security and law firm in London, covering security log generation and analysis. We use Apache Kafka for data stream ingestion and process the stream with Apache Spark, with lookups against files and HBase (a minimal sketch of this kind of pipeline follows below), and we have built a log-monitoring dashboard in Kibana.
We have also worked on data warehousing for the pharmaceutical industry, collecting and analyzing data to answer questions such as which disease occurs most often in which area. For this we use Python as the scripting language, Hive as the data store, and an AWS environment.
So we are experts at designing and architecting databases. Let's connect for a technical call. Looking forward to your reply. Thank you.
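A minimal Scala sketch of the Kafka-ingestion-plus-Spark-processing pattern described above; the broker addresses, topic name, filter condition, and console sink are placeholders for illustration, and the real pipeline would add the HBase lookup and a proper sink feeding Cassandra and Kibana.

    // Requires the spark-sql-kafka-0-10 package on the classpath.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object SecurityLogStream {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("security-log-stream")
          .getOrCreate()

        // Subscribe to the raw log topic (hypothetical broker and topic names).
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
          .option("subscribe", "security-logs")
          .load()

        // Kafka delivers the payload as bytes; cast it to a string before filtering.
        val alerts = raw.selectExpr("CAST(value AS STRING) AS line")
          .filter(col("line").contains("DENY")) // illustrative filter only

        // Console sink for the sketch; a production job would write to a real store.
        val query = alerts.writeStream
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/security-logs")
          .start()

        query.awaitTermination()
      }
    }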
$55 USD in 40 days
0.0 (0 reviews)
0.0
I have implemented large solutions for my employer with Hadoop, Spark, and Kafka, and have also worked on multiple platforms: Databricks, Cloudera, and Amazon EMR. I have been playing architect and lead roles, so I believe I can help with your needs.
$55 USD in 40 days
0.0 (0 reviews)
0.0
I have 2 years of experience working on the Big Data Analytics team at Amazon. I have worked with technologies such as Hadoop, Spark, Scala, Pig, and Hive, and have independently designed and developed systems using them.
$55 USD in 20 days
0.0 (0 reviews)
0.0
Experienced Hadoop admin with HDP certification
Certified RHEL admin
Certified MongoDB admin
DevOps expert
Will give a free demo
End-to-end solutions
$55 USD in 40 days
0.0 (0 reviews)
0.0
I have 2.11 years of IT experience in application development with Spark, Scala, Hadoop (HDFS), and its ecosystem (Hive and Sqoop). I have good knowledge of tools like Autosys R11, PuTTY, Cloudera Manager, Unravel, Git, and Ivy, and of HDFS, the NameNode, DataNodes, the Resource Manager, the NodeManagers, and how they work together. I have experience writing Hive queries to process data and extensive working experience with Scala and Spark, including developing and implementing common MapReduce-style algorithms in Scala, such as sorting and searching, according to client requirements. I have worked on analyzing text files, JSON files, and semi-structured data using Spark and Scala (a minimal sketch follows below), and on storing, merging, moving, and retrieving data in HDFS using Linux commands. I am familiar with configuring and working with Apache Sqoop and have experience loading data into Hive partitions. I have logical, analytical, and good interpersonal skills and a commitment to quality work; I am very flexible and can work independently as well as in a team, and I am willing to update my knowledge and learn new skills according to business requirements.
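As a small illustration of the JSON analysis and Hive-partition loading mentioned above, here is a minimal Spark/Scala sketch; the input path, database, table name, and partition column are assumptions invented for the example.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    object LoadJsonToHive {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("json-to-hive")
          .enableHiveSupport() // assumes a Hive metastore is configured for the cluster
          .getOrCreate()

        // Read line-delimited JSON; Spark infers the schema from the records.
        val events = spark.read.json("hdfs:///data/raw/events/")

        // Append into a Hive table partitioned by a hypothetical event_date column.
        events.write
          .mode(SaveMode.Append)
          .partitionBy("event_date")
          .saveAsTable("analytics.events")

        spark.stop()
      }
    }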
$55 USD in 40 days
0.0 (0 reviews)
0.0
Hi, we are a group of technologists with experience in big data technologies. We carried out a large-scale implementation of Hadoop projects on AWS using Hortonworks for a leading healthcare provider. This included microservices at the front end to ingest data from multiple sources and store it in a data lake, plus a Hortonworks-based Hadoop cluster with 20+ pipelines that preprocess data, score machine learning models, store curated data in a data warehouse (Redshift), and feed analytics in DOMO; a minimal sketch of the Redshift load step follows below. The pipelines used Spark/Scala and Spring-based microservices, and I was the platform architect for the project. I believe we will be able to assist in providing design alternatives and their implications, and also help with the implementation. Having already implemented similar projects, we can guide you to avoid the pitfalls.
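A minimal Scala sketch of the "store curated data in Redshift" step this bid mentions. It uses a plain JDBC write for simplicity, whereas real pipelines often stage through S3 with a bulk loader; the cluster endpoint, table name, paths, and credential handling are all placeholder assumptions.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    object CurateToRedshift {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("curate-to-redshift")
          .getOrCreate()

        // Hypothetical curated output of the upstream preprocessing pipelines.
        val curated = spark.read.parquet("s3://example-bucket/curated/claims/")

        // Append into the warehouse over JDBC; the Redshift JDBC driver jar must be on the
        // classpath, and the driver class name depends on the driver version in use.
        curated.write
          .mode(SaveMode.Append)
          .format("jdbc")
          .option("url", "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev")
          .option("dbtable", "analytics.claims_curated")
          .option("user", sys.env.getOrElse("REDSHIFT_USER", "dev"))
          .option("password", sys.env.getOrElse("REDSHIFT_PASSWORD", ""))
          .option("driver", "com.amazon.redshift.jdbc42.Driver")
          .save()

        spark.stop()
      }
    }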
$55 USD in 40 days
0.0 (0 reviews)
0.0
Hi, I am a Hadoop consultant with hands-on experience deploying Hadoop clusters in the data center and on AWS. I am strong at configuring Hadoop security and have worked as a Hadoop consultant for top banks and MNCs. Thanks.
$50 USD in 10 days
0.0 (0 reviews)
0.0

About the client

Indonesia
0.0
0
Member since Nov 14, 2016

Client Verification
