
Location

Poland

Rate

$48 per hour

Years of experience

8+

About

I am a passionate data professional with a robust background in ETL processes, data science, big data processing, and cloud computing. Over the past six years, I've honed my skills in Python, SQL, and big data technologies such as Apache Spark and Hadoop. My experience spans multiple industries, including the banking, medical, e-commerce, and energy sectors. I've designed and implemented data lakes, analytical platforms, and ELT processes using tools such as Databricks, Airflow, and various AWS and Azure services. I am also proficient in creating and managing CI/CD pipelines, with a keen interest in optimizing processing costs and reducing downtime. My passion for continuous learning is reflected in my deepening knowledge of big data technologies, alongside my enthusiasm for football, the used-cars market, and Polish cuisine.

In my most recent role, as a Contractor and Lead Data Engineer, I advised on data lake maintenance and expansion, developed processes and architectures for data lakes, and reduced AWS EMR processing costs by 25%. My responsibilities included integrating frameworks, troubleshooting, and creating scalable solutions in cloud environments. Before that, I was a Big Data Developer at Lingaro, where I led projects, developed custom Apache Spark listeners, and built data processing engines. My tenure at PwC Advisory saw me orchestrating data workflows, supervising tasks, and optimizing storage and processing solutions.

With a strong educational background in Big Data, Econometrics, and Mathematics from the Warsaw School of Economics and the University of Warsaw, I am well-equipped to tackle complex data challenges and contribute effectively to any data-driven organization.

Tech Stack

Big Data, Apache Spark, AWS, Azure, CI/CD, Databricks, Hadoop, Python, SQL

Experience

  • Developed and maintained data lakes and analytical platforms using Databricks on AWS and Azure, ensuring scalability, data security, and automation of infrastructure as code (IaC).
  • Reduced production AWS EMR processing costs by 25% and decreased downtime by 37% through effective optimization techniques, resource management, and configuration adjustments.
  • Designed and implemented efficient ETL/ELT processes using Apache Spark, Airflow, and Databricks, tailored to industry requirements in the banking, medical, and e-commerce sectors (a minimal PySpark sketch follows this list).
  • Utilized AWS and Azure services (S3, IAM, Lambda, EC2, RDS, DynamoDB, Kinesis, Glue, ADLS, EventHubs) to build robust cloud-based data solutions and frameworks.
  • Led project teams, distributed tasks, reviewed pull requests, and supervised the implementation of big data solutions, ensuring adherence to project timelines and quality standards.
  • Developed and maintained continuous integration and continuous deployment (CI/CD) pipelines for schema migrations, workflows, and cluster pools using tools like Git, Jenkins, Azure Repos, and Azure Pipelines.
  • Developed integration frameworks for FHIR format data and Azure Databricks, troubleshooting and optimizing Delta Live Tables jobs to ensure seamless data processing and integration.
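For illustration, a minimal sketch of the kind of PySpark ELT step described above, assuming a Spark session with Delta Lake support (e.g., a Databricks cluster); the bucket path, column names, and target table are hypothetical placeholders, not the actual client pipeline.

# Minimal PySpark ELT step: ingest raw CSV, clean it, and write a Delta table.
# Paths, columns, and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_elt").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-bucket/raw/orders/")  # hypothetical landing zone
)

cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("country")
    .saveAsTable("analytics.orders_clean")  # hypothetical target table
)

In practice, a step like this would run as a scheduled task inside a Databricks Workflow or an Airflow DAG.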

Employment history

Contractor, April 2021 – Present

Advised on Data Lake maintenance and expansion (banking sector, EU, as Lead Data Engineer):
• Apache Spark / Airflow / AWS development (processes, code, architecture) for the Data Lake.
• Built an analytical platform on Databricks on AWS (addressing scalability, data security in the cloud, and IaC automation).
• Reduced production AWS EMR processing costs by 25% and decreased downtime by 37%.
Built a data processing framework for FHIR-compliant data (medical sector, US):
• Developed a FHIR – Azure – Databricks integration framework (including an automated Cucumber / pytest-bdd test framework).
• Troubleshot Delta Live Tables jobs.
Implemented a PoC for an Azure Databricks-based Data Lake (e-commerce, PL):
• Designed ELT processes (PySpark, Databricks Workflows).
• Created CI/CD processes for schema migrations, workflows, cluster pools, etc.
Designed Apache Airflow architecture for an MFT business case (energy sector, PL); a minimal DAG sketch follows this section.
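For illustration, a minimal sketch of an Airflow DAG for a managed-file-transfer-style flow, assuming Airflow 2.4+ (for the schedule parameter); the DAG id, task names, and commands are hypothetical placeholders rather than the actual energy-sector design.

# Minimal Airflow DAG sketch for an MFT-style flow: fetch, validate, load.
# DAG id, schedule, and commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="mft_daily_transfer",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    fetch = BashOperator(
        task_id="fetch_from_sftp",
        bash_command="echo 'pull files from the partner SFTP endpoint'",
    )
    validate = BashOperator(
        task_id="validate_checksums",
        bash_command="echo 'verify file integrity before loading'",
    )
    load = BashOperator(
        task_id="load_to_data_lake",
        bash_command="echo 'copy validated files to the data lake landing zone'",
    )

    fetch >> validate >> load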

Big Data Developer, Lingaro, July 2020 – March 2021

Developed custom Apache Spark listeners (FMCG):
• Led the project.
• Gathered logs produced by Spark jobs on Databricks (an event-log parsing sketch follows this section).
• Visualized and pointed out weak spots, cost generators, and suboptimal queries.
Master Data Engineering (FMCG):
• Migrated SAP-based ETL to Microsoft Azure.
• Built a data processing engine from scratch (Databricks + Airflow + ADLS + Docker).
• Built REST APIs connecting the engine's components.
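As an illustration of the log-gathering step, a minimal sketch that parses a Spark event log (newline-delimited JSON) and reports the slowest completed stages; the file path is hypothetical, and this is a post-hoc parser rather than the custom listener itself, which hooks into the Spark listener API.

# Minimal sketch: scan a Spark event log and report the slowest stages.
# The path is a hypothetical placeholder; Spark writes event logs to the
# location configured via spark.eventLog.dir.
import json

stage_durations = []
with open("/tmp/spark-events/app-20240101-0001") as f:  # hypothetical event log
    for line in f:
        event = json.loads(line)
        if event.get("Event") == "SparkListenerStageCompleted":
            info = event["Stage Info"]
            duration_ms = info.get("Completion Time", 0) - info.get("Submission Time", 0)
            stage_durations.append((duration_ms, info.get("Stage Name", "unknown")))

# Print the five slowest stages as candidates for optimization.
for duration_ms, name in sorted(stage_durations, reverse=True)[:5]:
    print(f"{duration_ms / 1000:.1f}s  {name}")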

Data Engineer (Senior Associate), PwC Advisory, October 2019 – June 2020

Big Data Engineering (Financial Services):
• Developed a solution responsible for orchestrating workflows from data vendors (public and private sources, both structured and unstructured) to a machine learning engine.
• Reviewed pull requests, distributed tasks to subordinates, and supervised their completion.
• Planned and executed data migration from HDFS to Azure Blob Storage.
• Optimized Apache Spark jobs and HDFS storage (a minimal tuning sketch follows this section).
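For illustration, a minimal sketch of typical Spark job tuning in such optimization work, assuming PySpark; the table names and output path are hypothetical, and the real changes were workload-specific.

# Minimal PySpark tuning sketch: broadcast a small dimension table instead of
# shuffling it, and coalesce output to reduce small-file pressure on HDFS.
# Table names and the output path are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning_sketch").getOrCreate()

facts = spark.table("warehouse.transactions")  # large fact table
dims = spark.table("warehouse.branches")       # small dimension table

joined = facts.join(F.broadcast(dims), on="branch_id", how="left")

(
    joined.coalesce(32)  # fewer, larger output files
    .write.mode("overwrite")
    .parquet("hdfs:///warehouse/transactions_enriched")
)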

Data Engineer (Associate), PwC Advisory, April 2018 – September 2019

Created a store chain expansion model (Retail):
• Designed and implemented a machine learning workflow responsible for predicting store income based on geographical and internal data (a minimal workflow sketch follows this section).
Cloudera Hadoop cluster administration:
• Configured nodes / roles, installed / updated software.
• Monitored performance and troubleshot issues.
• Prepared and maintained a working environment for Data Scientists (JupyterHub, Cloudera Data Science Workbench, MLflow, RStudio Server, etc.).
• Completed Cloudera Administrator Training for Apache Hadoop.
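For illustration, a minimal sketch of a store-income prediction workflow of the kind described above, assuming scikit-learn; the data file and feature names are hypothetical, not the actual retail model.

# Minimal regression sketch for predicting store income from location features.
# File path and feature names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("stores.csv")  # hypothetical dataset
features = ["population_2km", "competitors_1km", "store_area_m2", "parking_spots"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["monthly_income"], test_size=0.2, random_state=42
)

model = GradientBoostingRegressor(random_state=42).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))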

Business Analyst, Creamfinance Poland, May 2016 – March 2018

• Developed a KPI-tracking Shiny application.
• Developed a process responsible for handling loan assignments to external debt collectors.
• Refactored an LGD calculation model from an Excel-based tool into a standalone Shiny dashboard.

Intern, Citi Service Center, July 2015 – September 2015

• Assisted in the development and maintenance of various financial and operational reports, ensuring accurate data collection and presentation for internal review and decision-making processes.
• Conducted data analysis to support various departments, identifying trends and insights to aid in strategic planning and operational improvements.
• Provided administrative support to the team, including scheduling meetings, preparing documentation, and assisting with project management tasks to ensure smooth and efficient operations.

Education history

Warsaw School of Economics, October 2018: Big Data, Master of Science degree
University of Warsaw, October 2014 – July 2017: Bachelor of Science degree (thesis: Analysis of dependencies between S&P 500, DAX, and WIG20 changes)
University of Warsaw, October 2013 – September 2016: Master of Science degree in Mathematics
