Snowflake pyspark jobs

    1,755 snowflake pyspark jobs found, pricing in USD

    I'm in need of a Machine Learning Engineer who can migrate our existing notebooks from RStudio and PySpark to AWS SageMaker. Your task will be to: - Understand two models I have running locally. One is an RStudio logistic regression model, and the other is a PySpark XGBoost model, also running locally. - Migrate these two models to AWS SageMaker. Data will be on S3. - Prepare the models to run entirely on SageMaker, so that we can do training and testing 100% on SageMaker. The models are already running on a local computer, but I need to move them to SageMaker 100%. The data is on S3 already. - You need to configure and prepare SageMaker from end to end, and teach me how you did it, since I need to replicate it in another system. - I will give you the data and access to AWS. Ideal Skills and...

    $224 (Avg Bid)
    $224 Avg Bid
    13 bids

    I need an Azure Function to facilitate daily data transfer from Snowflake S3 storage to Azure Blob storage. Key Requirements: - I have access to both the Snowflake S3 storage and the Azure Blob storage. - The primary goal of this data migration is to integrate the data with other Azure services. - The data should be transferred on a daily basis. Ideal Freelancer: - Proficient in Azure Functions and Azure Blob Storage. - Previous experience working with Snowflake and its utility for data migration. - A strong understanding of real-time data transfer systems is an added advantage.

    $22 / hr (Avg Bid)
    $22 / hr Avg Bid
    48 bids

    I'm searching for a proficient Python Developer to construct a high-performance database proxy and load balancer. This system should cleverly guide requests to compute engin...queuing. Key Aspects of Project: - Develop a robust database proxy using Python - Implement intelligent request routing to minimize cost and maximize throughput - Guarantee low latency and minimal to zero queuing while executing queries The ideal candidate will have the following experience and skills: - Proficiency in Python - In-depth understanding of database management systems, particularly Snowflake - Experience with handling structured data - Expertise in developing scalable load balancer systems - Demonstrated experience with optimizing for cost and throughput Interested professionals are ...

    $3000 - $5000
    Featured Urgent Sealed NDA
    $3000 - $5000
    16 bids

    I'm looking for an experienced Oracle specialist with a sharp eye for detail to help me resolve syntax errors in my Oracle 11g code. Key responsibilities include: - Identifying and rectifying syntax errors in my existing Oracle codebase I'd prefer regular video calls for updates and progress reports. Your insights into improving my code's efficiency are also highly appreciated.

    $32 (Avg Bid)
    $32 Avg Bid
    25 bids

    The Data Engineer contractor role will be a project-based role focused on migrating data pipelines from legacy infrastructure and frameworks such as Scalding to more modern infrastructure we support, such as Spark Scala. This role will be responsible for: Analyzing existing data pipelines to understand their architecture, dependenci...Requirements: The ideal candidate is a Data Engineer with considerable experience in migrations and Big Data frameworks. Must-Haves: - Scala programming language expertise - Spark framework expertise - Experience working with BigQuery - Familiarity with scheduling jobs in Airflow - Fluency with Google Cloud Platform, in particular GCS and Dataproc - Python programming language fluency - Scalding framework fluency - PySpark framework fluency - Dataflow (Apache Beam) framewor...

    $303 (Avg Bid)
    $303 Avg Bid
    8 bids

    Hi, please apply only as an individual. Agencies can apply, but the budget should not be more than mentioned. Role: GCP Engineer (OTP). Exp: 7+ yrs. Shift: IST. Skills: Cloud Storage Buckets, BigQuery (SQL, data transformations and movement), Airflow (Python, DAGs), DBT, IAM Policies, PyCharm, Databricks (PySpark), Azure DevOps. Clear and confident communication.

    $1424 (Avg Bid)
    $1424 Avg Bid
    8 bids

    I am in need of a competent Snowflake SQL developer with expert knowledge in translating SQL Server queries to Snowflake SQL. Your role will involve figuring out the data model from an analysis of the data. No ER diagrams will be provided, so a deep understanding of data modeling is essential. You should be willing to get on a call to do the job. You will be paid hourly for your time. You will not be provided direct access to the database; however, your advice will be executed on the call.

    $14 / hr (Avg Bid)
    $14 / hr Avg Bid
    20 bids
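A few of the T-SQL-to-Snowflake differences such a migration hits are purely mechanical. A minimal sketch of the idea in Python (the function name and the three rewrite rules are illustrative, not a complete translator):

```python
import re


def translate(sql: str) -> str:
    """Mechanically rewrite a few common T-SQL constructs to Snowflake SQL.

    Illustrative only: a real migration needs a proper SQL parser; regex
    rewrites like these only cover unambiguous, mechanical patterns.
    """
    out = sql
    # T-SQL GETDATE() -> Snowflake CURRENT_TIMESTAMP()
    out = re.sub(r"\bGETDATE\s*\(\s*\)", "CURRENT_TIMESTAMP()", out, flags=re.I)
    # T-SQL ISNULL(a, b) -> Snowflake IFNULL(a, b)
    out = re.sub(r"\bISNULL\s*\(", "IFNULL(", out, flags=re.I)
    # SELECT TOP n ...  ->  SELECT ... LIMIT n
    m = re.search(r"\bSELECT\s+TOP\s+(\d+)\b", out, flags=re.I)
    if m:
        out = out[:m.start()] + "SELECT" + out[m.end():]
        out = out.rstrip().rstrip(";") + f" LIMIT {m.group(1)}"
    return out


print(translate("SELECT TOP 10 name, ISNULL(city, 'n/a'), GETDATE() FROM t;"))
```

Anything beyond simple patterns (correlated subqueries, proprietary functions, hints) is better handled query-by-query on the call, as the posting describes.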

    I am looking for a dedicated specialist well-versed in using Databricks and PySpark for data processing tasks, with a primary focus on data transformation. With the provision of JSON format files, you will perform the following tasks: - Carry out complex data transformations - Implement unique algorithms to ensure efficient data processing - Test results against required benchmarks Ideal Skills: - Proficient in Databricks and PySpark. - Must possess a solid background in data transformation. - Experience handling large JSON datasets. The end goal is to achieve seamless data transformation leveraging the power of Databricks and PySpark, enhancing our ability to make informed business decisions. Please provide your completed projects, and the strategies you've used ...

    $46 / hr (Avg Bid)
    $46 / hr Avg Bid
    29 bids

    ...functions to handle data quality and validation. - Should have a good understanding of S3, CloudFormation, CloudWatch, Service Catalog, and IAM Roles. - Perform data validation and ensure data accuracy and completeness by creating automated tests and implementing data validation processes. - Should have good knowledge of Tableau, including creating Tableau Published Datasets and managing access. - Write PySpark scripts to process data and perform transformations. (Good to have) - Run Spark jobs on an AWS EMR cluster using Airflow DAGs. (Good to have)...

    $1487 (Avg Bid)
    $1487 Avg Bid
    22 bids

    ...Stay current with new technology options and vendor products, evaluating which ones would be a good fit for the company. Troubleshoot the system and solve problems across all platform and application domains. Oversee pre-production acceptance testing to ensure the high quality of the company's services and products. Skill Sets: Strong development experience in AWS Step Functions, Glue, Python, S3, PySpark. Good understanding of data warehousing, large-scale data management issues, and concepts. Good experience in Data Analytics & Reporting and Modernization projects. Expertise in at least one high-level programming language such as Java or Python. Skills for developing, deploying & debugging cloud applications. Skills in AWS API, CLI and SDKs for writing applications. Knowledge...

    $718 (Avg Bid)
    $718 Avg Bid
    27 bids

    I am in need of a proficient PySpark coder to aid in debugging errors present within my current code. The main focus of this project is optimization and troubleshooting. Unfortunately, I can't specify the type of errors; I need a professional to help identify and rectify them. If you are an experienced PySpark coder with a keen eye for bug identification and problem solving, I'd appreciate your expertise.

    $7 - $20
    Sealed
    $7 - $20
    10 bids

    I'm searching for a PySpark expert who can provide assistance on optimizing and debugging current PySpark scripts. I am specifically focused on PySpark, so expertise in this area is crucial for the successful completion of this project. Key Responsibilities: - Optimizing PySpark scripts to improve efficiency - Debugging current PySpark scripts to resolve existing issues Ideal Candidate: - Proficient with PySpark - Experience in big data management, data ingestion, processing, analysis, visualization, and reporting - Strong problem-solving skills to identify and resolve issues effectively - Knowledgeable in performance tuning within PySpark.

    $102 (Avg Bid)
    $102 Avg Bid
    65 bids

    I'm looking for a skilled freelancer to create a Spark script that transfers data from a Hive metastore to an S3 bucket. The goal of this project is to enable backup and recovery. Skills and Experience: - Proficiency in Spark and Hive - Extensive experience with S3 buckets - Understanding of data backup strategies Project Details: - The script needs to read the schema and perform a metadata transfer for the selected schemas to the S3 bucket. - Only bid if you have work experience with Spark, Hive, and S3. - 4 schemas need to be migrated. - I already have access to S3 configured. - I have a local instance of NetApp S3 available and the bucket created. - The server is Ubuntu.

    $98 (Avg Bid)
    $98 Avg Bid
    10 bids
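For a Hive-to-S3 backup like the one described above, the core loop is short. A minimal sketch, assuming Spark with Hive support and s3a credentials already configured (schema and bucket names are hypothetical; the metadata transfer the posting mentions would additionally need DDL export, e.g. `SHOW CREATE TABLE`):

```python
def backup_path(bucket: str, schema: str, table: str) -> str:
    """Target key for one table's Parquet backup (layout is an assumption)."""
    return f"s3a://{bucket}/{schema}/{table}"


def backup_schemas(schemas, bucket="netapp-backup"):  # bucket name hypothetical
    # PySpark is imported here so the path helper above can be inspected
    # and tested without a Spark installation.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive-to-s3-backup")
             .enableHiveSupport()          # read table list from the metastore
             .getOrCreate())
    for schema in schemas:
        # SHOW TABLES IN <schema> yields rows with a `tableName` column
        for row in spark.sql(f"SHOW TABLES IN {schema}").collect():
            (spark.table(f"{schema}.{row.tableName}")
                  .write.mode("overwrite")
                  .parquet(backup_path(bucket, schema, row.tableName)))
```

In practice you would also capture each table's DDL alongside the data so the schemas can be recreated on restore.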

    I am looking for an experienced data analyst who is well-versed in PySpark to clean up a medium-sized dataset in a CSV file format. The file contains between 10k-100k rows, and your primary role will be to: - Remove duplicate data entries - Handle missing values - Aggregate the resultant data Your proficiency in using PySpark to automate these processes efficiently will be critical to the success of this project. Therefore, prior experience in handling and cleaning similar large datasets would be beneficial. Please note, this project requires precision, meticulousness, and a good understanding of data aggregation principles.

    $25 (Avg Bid)
    $25 Avg Bid
    9 bids
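The cleaning steps listed above map one-to-one onto PySpark calls: `dropDuplicates()`, `na.fill()`, and `groupBy().agg()`. The same logic, shown on a tiny in-memory sample in plain Python so it is easy to follow (column names are hypothetical; PySpark equivalents noted in comments):

```python
import csv
import io
from collections import defaultdict

SAMPLE = """region,amount
east,10
east,10
west,
east,5
"""

rows = list(csv.DictReader(io.StringIO(SAMPLE)))

# 1) Remove duplicate entries (PySpark: df.dropDuplicates())
seen, deduped = set(), []
for r in rows:
    key = tuple(r.items())
    if key not in seen:
        seen.add(key)
        deduped.append(r)

# 2) Fill missing values with a default (PySpark: df.na.fill({"amount": 0}))
for r in deduped:
    if not r["amount"]:
        r["amount"] = "0"

# 3) Aggregate per group (PySpark: df.groupBy("region").agg(F.sum("amount")))
totals = defaultdict(int)
for r in deduped:
    totals[r["region"]] += int(r["amount"])

print(dict(totals))   # {'east': 15, 'west': 0}
```

At 10k-100k rows this fits in memory either way; PySpark buys you the same API when the data later outgrows a single machine.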

    This vital task entails cleaning and sorting two CSV files, one of approximately 100,000 rows and a second of about 1.5 million rows, using PySpark (Python) in Jupyter Notebook(s). The project consists of several key tasks: Read in both datasets and then: - Standardize data to ensure consistency - Remove duplicate entries - Filter the columns we need - Handle and fill missing values - Aggregate data on certain groupings as output Important requirement: I also need unit tests to be written for the code at the end. Ideal Skills: Candidates applying for this project should be adept with PySpark in Python and have experience in data cleaning and manipulation. Experience with working on datasets of similar size would also be preferable. Attention to detail in ensuring ...

    $181 (Avg Bid)
    $181 Avg Bid
    57 bids

    ...am seeking a creative and experienced graphic designer to create a minimalist logo. Key project details: - Color Scheme: Incorporate shades of blue and red into the design (clear background). - Logo Style: Lean into a minimalist approach; less is more. - Imagery: Explore the integration of symbols related to air conditioning and comfort. This may include, but is not limited to, wind, flame, snowflake, sun. I am open to alternative suggestions that may better capture the essence of air conditioning and comfort. The ideal candidate would have a strong portfolio of minimalist logo designs, and the ability to create visually appealing logos that represent the company's services effectively. A background in designing logos for the HVAC or service industries would be a definite advan...

    $100 (Avg Bid)
    Guaranteed
    $100
    782 entries

    ...effectively parse links from Outlook emails, download CSV files from those links, and seamlessly add the content into my Snowflake DB using Fivetran. The project involves: - Parsing links from specific Outlook emails - Downloading the linked CSV files - Uploading these downloaded files into the Snowflake DB via Fivetran without making any modifications to the CSV data The emails in question follow a specific format, hence a clear pattern should be identifiable. The CSV files do not require any changes before being imported to the database. Since this project involves dealing with specific platforms, ideally, the successful candidate should possess: - A strong understanding of Snowflake DB and Fivetran - An ability to effectively work with Outlook email and CSV files ...

    $533 (Avg Bid)
    $533 Avg Bid
    33 bids
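For the Outlook-to-Snowflake pipeline above, the link-parsing step can be as simple as a regex over the email body. A sketch assuming the links end in `.csv` (the real email format may differ; fetching the mail itself and the Fivetran load are separate steps not shown):

```python
import re

# Pattern is an assumption about how the links look; adjust to the real emails.
CSV_LINK = re.compile(r"https?://[^\s\"'<>]+\.csv\b", re.I)


def extract_csv_links(body: str) -> list:
    """Pull every .csv link out of an email body, in order of appearance."""
    return CSV_LINK.findall(body)


body = """Hi team,
Today's export: https://reports.example.com/daily/2024-03-01.csv
Yesterday: https://reports.example.com/daily/2024-02-29.csv
"""
print(extract_csv_links(body))
```

Because the posting says the emails follow a fixed format, anchoring the pattern to that format (sender, subject, link position) is more robust than matching any `.csv` URL.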

    I am looking to streamline the process of extracting sales data from our Clover POS system and inserting it directly into our Snowflake data warehouse. This project involves utilizing FiveTran as the data integration tool and leveraging Azure Functions to automate the process. The aim is to create a seamless, automated data flow that allows for real-time analytics capabilities. **Requirements:** - Experience with Clover POS's API for data extraction. - Proficient in configuring and managing FiveTran for data integration. - Skilled in utilizing Azure Functions for automation between Clover and Snowflake. - Familiarity with Snowflake's data warehousing solutions and the ability to efficiently manage and organize data within it. - Ability to ensure data security and i...

    $514 (Avg Bid)
    $514 Avg Bid
    26 bids

    I'm seeking an experienced Data Engineer with proficiency in SQL and PySpark. Key Responsibilities: - Develop and optimize our ETL processes. - Enhance our data pipeline for smoother operations. The ideal candidate should deliver efficient extraction, transformation, and loading of data, which is critical to our project's success. Skills and Experience: - Proficient in SQL and PySpark - Proven experience in ETL process development - Previous experience in data pipeline optimization Your expertise will significantly improve our data management systems, and your ability to deliver effectively and promptly will be highly appreciated.

    $92 (Avg Bid)
    $92 Avg Bid
    17 bids

    - Conversion of the entire Python codebase into PySpark. Skills and experience required: - Proficient knowledge of Python.

    $26 (Avg Bid)
    $26 Avg Bid
    26 bids

    ...competent in either PySpark or RDD, using Python to create versatile code fitting several scenarios. Your main task will be to write code to compare rows using Python in line with the clear set of rules I provide. These rules are detailed in an attached Word document and are based on comparisons encompassing specific columns, presence or absence of particular data, and multiple criteria comparisons. The expected output is a reversal logic for claim_opened_timestamp_utc; I need the output shown on the right side. I need either PySpark or RDD to compare rows. I am using spark-3.3.0-bin-hadoop3 with py4j-0.10.9.5. I need your support until I execute it on my office computer. I need it in 3 days. Ideal Skills and Experience: - Proficiency in Python - Experience with...

    $156 (Avg Bid)
    Urgent
    $156 Avg Bid
    16 bids
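The actual comparison rules live in the attached Word document, so only the shape of the task can be sketched: compare each row against the previous one on selected columns. In PySpark this is typically a `Window` ordering plus `F.lag`; a plain-Python illustration with hypothetical column names:

```python
def flag_changes(rows, columns):
    """Mark rows whose values in `columns` differ from the previous row.

    Stand-in for the document's rules: in PySpark the same idea is
    F.lag(col).over(Window.orderBy(...)) compared against the current value.
    """
    flagged = []
    prev = None
    for row in rows:
        changed = prev is not None and any(row[c] != prev[c] for c in columns)
        flagged.append({**row, "changed": changed})
        prev = row
    return flagged


claims = [
    {"claim_id": 1, "status": "open"},
    {"claim_id": 1, "status": "open"},
    {"claim_id": 1, "status": "closed"},
]
print(flag_changes(claims, ["status"]))
```

The real job would replace the simple inequality with the multi-criteria rules from the document and order rows by `claim_opened_timestamp_utc` before comparing.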

    I'm a beginner user of Azure Databricks and PySpark. I'm looking to boost my skills to the next level and need an expert to guide me through advanced techniques. Ideal freelancers should have vast experience and profound knowledge in data manipulation using PySpark, Azure Databricks, data pipeline construction, and data analysis and visualization. If you've previously tutored or mentored in these areas, it'll be a plus.

    $12 / hr (Avg Bid)
    $12 / hr Avg Bid
    4 bids

    I need 2 small projects completed. The data needs to be pulled from an API using Python. The pulled data needs to be unnested, then transformed to answer some insights with a medallion architecture. Here, you need to showcase SCD Type 2 ingestions, incremental joins, managing PII information, and aggregation. Final deliverable needed for the 1st project (Databricks): Data model design and architecture overview; notebooks of transformations in Python and PySpark/Spark Scala. Final deliverable needed for the 2nd project (dbt): Data model design and architecture overview; dbt SQL and ...

    $276 (Avg Bid)
    $276 Avg Bid
    16 bids

    I need a skilled developer to create an OpenAI-based question-and-answer application. I am interested in a few key capabilities: - Natural language processing - Contextual understanding - Entity extraction The application front end should be in React, and the backend and core logic in Python. Data should be stored in Snowflake. The task is to upload files (PDF and Word documents); perform document storage, document processing, OCR if there is an embedded image in a document, and chunking; and create a vector-store database. Next, the user should be able to ask questions related to the document. Store the results in the database. The user should have the ability to give a thumbs up or down to the results. The project needs to be completed in 3 weeks or sooner; if it goes well then we can collaborate...

    $924 (Avg Bid)
    $924 Avg Bid
    93 bids

    Looking for someone with good skills in Airflow, Pyspark and SQL.

    $251 (Avg Bid)
    $251 Avg Bid
    13 bids

    Come do interesting GIS programming in a database. *** Please no AI responses. *** Experience with Snowflake and other big data is a plus. Able to advise and write. Able to work TODAY. Please respond by telling me what is a 3-4. **Requirements:** - Proficiency in spatial SQL -> a MUST for complex data query and retrieval. The ability to perform spatial analyses and geometric operations is crucial. - Expertise in creating engaging, interactive maps that can visualize spatial data and analyze spatial patterns on a global scale. This would involve dealing effectively with diverse data sets and potentially large volumes of data. - Strong background in PostgreSQL and PostGIS, with proven experience in writing advanced spatial queries. - Experience in web map development, wi...

    $26 / hr (Avg Bid)
    $26 / hr Avg Bid
    20 bids

    I am looking for a skilled professional in Python, with a comprehensive understanding of PySpark, Databricks, and GCP. A primary focus of the project is to build a data pipeline and apply time series forecasting techniques for revenue projection, using historical sales data. Key tasks will include: - Constructing a robust data pipeline using Python, PySpark, and Databricks. - Applying time series forecasting to produce revenue predictions. - Using Mean Squared Error (MSE) to measure model accuracy. The ideal candidate for this project would have: - Proven experience with Python, PySpark, Databricks, and GCP. - Expertise in time series forecasting models. - Practical understanding and use of Mean Squared Error (MSE) for model accuracy. - Experience with large scale ...

    $11 / hr (Avg Bid)
    $11 / hr Avg Bid
    14 bids

    I am looking to develop a sophisticated and efficient data pipeline for revenue forecasting. This pipeline will be implemented using Python, PySpark, Databricks, and GCP Big Data. Here is what you need to know about this task: - Data Source: The data originates from Google Cloud Platform's Big Data service. As such, the freelancer should have solid experience and understanding of working with Big Data services on GCP. - Data Update Frequency: The frequency of data updates will be confirmed during the project, but suffice to say the frequency could be high. Prior experience with real-time or near-real-time data processing will be highly beneficial. - Performance Metrics: The key performance metric I'm focusing on is data processing speed. The freelancer should have a strong kn...

    $18 / hr (Avg Bid)
    $18 / hr Avg Bid
    13 bids

    I'm in need of a specialist, ideally with experience in data science, Python, PySpark, and Databricks, to undertake a project encompassing data pipeline creation, time series forecasting and revenue forecasting. #### Goal: * Be able to extract data from GCP BigData efficiently. * Develop a data pipeline to automate this process. * Implement time series forecasting techniques on the extracted data. * Use the time series forecasting models for accurate revenue forecasting. #### Deadline: * The project needs to be completed ASAP, hence a freelancer with a good turnaround time is preferred. #### Key Skill Sets: * Data Science * Python, PySpark, Databricks * BigData on GCP * Time series forecasting * Revenue forecasting * Data Extraction and Automation Qualification in...

    $18 / hr (Avg Bid)
    $18 / hr Avg Bid
    15 bids

    I'm in need of a distinguished SQL developer with an adept background in ETL Testing within Snowflake. The successful candidate will be responsible for: - Carrying out efficient ETL Testing in Snowflake. This will involve knowledge in specific areas such as data extraction, data transformation, and data loading. - Execution and validation of SQL Test Cases. This specifically includes conducting data integrity tests and data retrieval tests. The ideal candidate should have proven experience in SQL development, ETL testing and knowledge in the Snowflake environment. An understanding of database design, query optimization, and database administration will be highly beneficial. The candidate should be detail-oriented and should have a keen eye for identifying and a...

    $1197 (Avg Bid)
    $1197 Avg Bid
    7 bids

    I am looking for a developer to create an AWS Glue and PySpark script that will strengthen the data management of my project. The task involves moving more than 100GB of text data from a MySQL RDS table to my S3 storage account on a weekly basis. Additionally, the procured data needs to be written to Parquet files for easy referencing. The developer will also need to send scripts to deploy the AWS Glue pipelines on Terraform, fitting all parameters. Skilled expertise in AWS Glue, PySpark, Terraform, and MySQL, and experience in handling large data, is required. There is no compromise on quality or the completion timeline. Effective performance on this project will open doors to more work opportunities on my various projects.

    $41 (Avg Bid)
    $41 Avg Bid
    15 bids
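A weekly MySQL-RDS-to-S3 extract like the one above usually comes down to a partitioned JDBC read and a Parquet write. A minimal sketch (bucket layout, partition column, and bounds are assumptions; the Spark import is kept inside the function so the path helper can be inspected without Spark installed):

```python
import datetime


def weekly_prefix(bucket: str, table: str, day: datetime.date) -> str:
    """S3 prefix for one weekly extract (partition layout is an assumption)."""
    year, week, _ = day.isocalendar()
    return f"s3://{bucket}/{table}/year={year}/week={week:02d}"


def extract(jdbc_url, table, bucket, user, password):
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rds-to-s3").getOrCreate()
    (spark.read.format("jdbc")
          .option("url", jdbc_url)           # e.g. jdbc:mysql://host:3306/db
          .option("dbtable", table)
          .option("user", user)
          .option("password", password)
          .option("partitionColumn", "id")   # assumes a numeric key to split on
          .option("lowerBound", 0)
          .option("upperBound", 100_000_000)
          .option("numPartitions", 32)       # parallel reads for the >100GB table
          .load()
          .write.mode("overwrite")
          .parquet(weekly_prefix(bucket, table, datetime.date.today())))
```

Without `partitionColumn`/`numPartitions` the JDBC read runs on a single executor, which is the usual bottleneck at this data size; in Glue the same options apply to the Spark session Glue provides.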

    ...need an expert in Snowflake and Streamlit who can assist me in seamlessly integrating them. Your main task will be to draw upon your understanding of data structures, including numerical, textual, and datetime data, to create an interactive web application. Key Responsibilities: * Design and implement an efficient Snowflake data model. * Develop interactive web applications using Streamlit that cater to numerical data, textual data, and datetime. * Integrate Snowflake and Streamlit to form a functional, streamlined system. Ideal Skills: * Solid experience in Snowflake data modeling * Proficiency in creating web applications with Streamlit * Proven experience in working with different types of data: numerical, textual, and datetime * Previous work in ...

    $119 (Avg Bid)
    $119 Avg Bid
    1 bid

    I'm searching for a skilled data analyst who can help me perform an Extract, Load, Transform (ELT) task on my internal database. The chosen freelancer should: - Be able to handle a medium to high data volume (1 GB to 10 GB). - Have experience in extracting data from multiple sources. - Be familiar with loading data into Snowflake. - Understand how to transform data using HiTouch (reverse ETL). - Be efficient with data analysis using Mixpanel. In-depth knowledge of Google Analytics is also beneficial. I expect you to have demonstrable experience in ELT tasks and be capable of delivering quality results accurately and on time. Your understanding of these processes will be crucial to the success of this project.

    $671 (Avg Bid)
    $671 Avg Bid
    21 bids

    I am seeking a skilled professional proficient in managing big data tasks with Hadoop, Hive, and PySpark. The primary aim of this project involves processing and analyzing structured data. Key Tasks: - Implementing Hadoop, Hive, and PySpark for my project to analyze large volumes of structured data. - Use Hive and PySpark for sophisticated data analysis and processing techniques. Ideal Skills: - Proficiency in the Hadoop ecosystem - Experience with Hive and PySpark - Strong background in working with structured data - Expertise in big data processing and data analysis - Excellent problem-solving and communication skills Deliverables: - Converting raw data into useful information using Hive, and visualizing the results of queries in graphical representation...

    $17 / hr (Avg Bid)
    $17 / hr Avg Bid
    15 bids

    ...currently searching for an experienced AWS Glue expert, proficient in PySpark with data frames and Kafka development. The ideal candidate will have: • Expertise in data frame manipulation. • Experience with Kafka integration. • Strong PySpark development skills. The purpose of this project is data integration, and we will be primarily processing data from structured databases. The selected freelancer should be able to work with these databases seamlessly, ensuring efficient and effective data integration using AWS Glue. The required work would involve converting structured databases to fit into a data pipeline, setting up data processing, and integrating APIs using Kafka. This project requires a strong background in AWS Glue, PySpark, data frame ...

    $235 (Avg Bid)
    $235 Avg Bid
    24 bids

    I'm seeking assistance to develop a Python-based solution utilizing PySpark for efficient data processing using the Chord Protocol. This project demands an intermediate level of expertise in Apache Spark or PySpark, combining distributed computing knowledge with specific focus on Python programming. Key Requirements: - Proficiency in Python programming and PySpark framework. - Solid understanding of the Chord Protocol and its application in data processing. - Capable of implementing robust data processing solutions in a distributed environment. Ideal Skills and Experience: - Intermediate to advanced knowledge in Apache Spark or PySpark. - Experience in implementing distributed file sharing or data processing systems. - Familiarity with network communicati...

    $545 (Avg Bid)
    $545 Avg Bid
    38 bids

    I am eager to further develop my existing understanding of the Snowflake data warehousing platform. My primary area of interest is learning how to ingest data on this platform. I have an external Snowflake share with about 34 tables/views for this practice, which is refreshed daily. I wish to create a permanent database in my account from the external share. Unfortunately, if there are edits to the shared data, they overwrite the prior day's data, so I want to make sure my account keeps a history to catch changes. I wish to automate the daily ingestion of this shared data. Once done, we can move on to making a second basic reporting database in my account which Power BI and other tools can leverage. The tutorial will ideally include hands-on training. The expert I...

    $20 / hr (Avg Bid)
    $20 / hr Avg Bid
    7 bids

    I am seeking a skilled freelancer to develop a Snowflake-based data sharing framework in snowpark worksheet focused primarily on sharing data with external parties efficiently with python code. The core objective is to enable our organization to share a range of data securely and dynamically with our partners and clients. **Key Requirements:** - Implement a role-based access control (RBAC) system to manage data access meticulously, ensuring only authorized users can access specific datasets. - Develop functionality to support the sharing of semi-structured data formats, specifically JSON and XML, preserving the integrity and structure during the sharing process. **Ideal Skills and Experience:** - Proficient in Snowflake data warehousing solutions with a strong understanding...

    $10 / hr (Avg Bid)
    $10 / hr Avg Bid
    6 bids

    ...Professional with strong expertise in PySpark for a multi-faceted project. Your responsibilities will extend to, but are not limited to: - Data analysis: You'll be working with diverse datasets including customer data, sales data and sensor data. Your role will involve deciphering this data, identifying key patterns and drawing out impactful insights. - Data processing: A major part of this role will be processing the mentioned datasets, and preparing them effectively for analysis. - Performance optimization: The ultimate aim is to enhance our customer targeting, boost sales revenue and identify patterns in sensor data. Utilizing your skills to optimize performance in these sectors will be highly appreciated. The ideal candidate will be skilled in Hadoop and PySpark wi...

    $463 (Avg Bid)
    $463 Avg Bid
    25 bids

    I am seeking a freelancer who is not only SnowPro Core Certified but also has a wealth of experience in utilizing Snowflake for various data-centric projects. The purpose behind hiring a certified professional is to ensure that we have the right talent to tackle specific challenges related to data migration, data modeling and optimization, and improving query performance. **Key Requirements:** - **SnowPro Core Certification** is mandatory. - Proven experience in **data migration projects**, showcasing the ability to transfer data efficiently between different systems. - Expertise in **data modeling and optimization**, with the ability to design effective data storage, retrieval, and caching solutions. - Strong skills in **query performance tuning**, ensuring that data queries are...

    $23 / hr (Avg Bid)
    $23 / hr Avg Bid
    12 bids

    ...analytical databases - specifically Cassandra, BigQuery, Snowflake, and Redshift. Key Responsibilities: - Research, understand, and articulate the distinct approaches of the specified databases - Translate complex concepts into clear, concise, and reader-friendly articles Ideal Candidate Should Have: - Very deep expertise in databases and distributed systems. - Ideally, a Ph.D. or deep research-writing experience; publications in top conferences are a plus. - An understanding of database architectures - Prior experience writing technical articles for a technical audience - The ability to explain complex topics in an easy-to-understand manner - Knowledge of Cassandra, BigQuery, Snowflake, and Redshift will be a big plus....

    $19 / hr (Avg Bid)
    $19 / hr Avg Bid
    21 bids

    Build a Glue ETL job using PySpark to transfer data from MySQL to Postgres. We are facing challenges with column mappings between the two sources; the target database has enum and text-array datatypes. You should resolve the errors in the column mappings and have prior experience ingesting data into the Postgres enum datatype.
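    For the enum columns, a common fix is the Postgres JDBC URL parameter `stringtype=unspecified`, which lets the server coerce string bindings into enum values; Spark's Postgres JDBC dialect also handles array columns. A minimal sketch, with host, database, and table names as placeholders:

```python
# Sketch: MySQL -> Postgres via Glue/PySpark where the target has enum
# columns. The key is stringtype=unspecified in the Postgres JDBC URL;
# all connection details below are hypothetical.

def pg_jdbc_url(host: str, db: str, port: int = 5432) -> str:
    """Postgres JDBC URL that allows string columns to bind to enums."""
    return f"jdbc:postgresql://{host}:{port}/{db}?stringtype=unspecified"

# Inside the Glue job (PySpark):
# df = (spark.read.format("jdbc")
#       .option("url", "jdbc:mysql://source-host:3306/src_db")
#       .option("dbtable", "orders")
#       .load())
# (df.withColumn("status", df["status"].cast("string"))  # enum target column
#    .write.format("jdbc")
#    .option("url", pg_jdbc_url("target-host", "tgt_db"))
#    .option("dbtable", "orders")
#    .mode("append")
#    .save())
```

    Without `stringtype=unspecified`, the Postgres driver binds strings as `varchar` and the insert into an enum column fails with a type-mismatch error, which matches the mapping errors described above.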

    $22 / hr (Avg Bid)
    $22 / hr Avg Bid
    54 bids

    Job Posting Position: Part-Time Data Engineer / Data Science Instructor Location: Remote (DMV area preferred) Schedule: Tuesday & Thursday 8 to 10PM and Saturday 10AM to 12PM EST Responsibilities: - Deliver engaging and informative data engineering lectures. - Experience in Azure Data Factory, Databricks, Snowflake - Design and implement hands-on exercises and projects. - Provide constructive feedback and support to students. - Stay updated on industry trends and technologies. Requirements: - Proven experience in data engineering. - Strong knowledge of relevant tools and languages (e.g., Python, SQL, ETL, Tableau, Power BI, Machine learning) - Excellent communication and teaching skills. - Bachelor's degree in a related field (Master's preferred). ...

    $5 / hr (Avg Bid)
    $5 / hr Avg Bid
    8 bids

    Job Posting Position: Part-Time Data Engineer / Data Science Instructor Location: Remote (DMV area preferred) Schedule: Tuesday & Thursday 8 to 10PM and Saturday 10AM to 12PM EST Responsibilities: - Deliver engaging and informative data engineering lectures. - Experience in Azure Data Factory, Databricks, Snowflake - Design and implement hands-on exercises and projects. - Provide constructive feedback and support to students. - Stay updated on industry trends and technologies. Requirements: - Proven experience in data engineering. - Strong knowledge of relevant tools and languages (e.g., Python, SQL, ETL, Tableau, Power BI, Machine learning) - Excellent communication and teaching skills. - Bachelor's degree in a related field (Master's preferred). ...

    $26 / hr (Avg Bid)
    $26 / hr Avg Bid
    21 bids

    ...individual to assist with enhancing my company’s data warehouse. I'm specifically looking for expertise in a few critical areas: - **Data Integration & Transformation:** - Unify disparate sources - Cohesive data flow creation - ETL processes for structured data - **Data Modeling & Design:** - Implement Snowflake Schema efficiently - Optimize for scalability and performance **Ideal Skills and Experience:** - Proficiency in SQL and ETL tools - Experience with Snowflake Schema modeling - Strong understanding of structured data integration - Familiarity with data warehousing best practices By applying your knowledge and skills, you will directly contribute to the robustness and intelligence of our data warehousing solutions. Looking forwar...
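    As a sketch of the Snowflake Schema modeling, normalizing one hierarchy level of a dimension into its own (outrigger) table might look like this; the table and column names (dim_product, dim_category) are illustrative, not from the posting:

```python
# Sketch: "snowflaking" a star-schema dimension by splitting a hierarchy
# level into a normalized parent table referenced by a foreign key.

def snowflake_dimension(dim: str, parent: str,
                        parent_cols: list[str]) -> list[str]:
    """Emit DDL for a dimension plus its normalized parent table."""
    parent_ddl = (f"CREATE TABLE {parent} (\n"
                  f"  {parent}_key INT PRIMARY KEY,\n  "
                  + ",\n  ".join(f"{c} VARCHAR" for c in parent_cols)
                  + "\n)")
    dim_ddl = (f"CREATE TABLE {dim} (\n"
               f"  {dim}_key INT PRIMARY KEY,\n"
               f"  name VARCHAR,\n"
               f"  {parent}_key INT REFERENCES {parent}({parent}_key)\n)")
    return [parent_ddl, dim_ddl]

ddl = snowflake_dimension("dim_product", "dim_category", ["category_name"])
```

    The trade-off versus a denormalized star dimension is less storage and cleaner updates for the shared hierarchy, at the cost of an extra join at query time.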

    $77 (Avg Bid)
    $77 Avg Bid
    11 bids