
Fabio R.
Data Engineer

Java
Python
MongoDB
Bio

Data Tech Lead with substantial professional experience and a robust background in technology. Specialized in cloud platforms, currently focusing on GCP. Holds the Google Cloud Professional Data Engineer certification, reflecting expertise and dedication in the field.

Has a diverse background as a software engineer with proficiency in Java, Python, and Ruby, applied across sectors such as telecommunications, oil & gas, mining, IT, and financial markets. Contributed to significant international projects, including work for the International Olympic Committee during the Rio 2016 Olympics and leading data analysis for pipeline projects in the Netherlands.

Recognized for versatility and effective communication, with a distinctive investigative and analytical approach. Constantly in pursuit of intelligent solutions and passionate about uncovering valuable insights from data.

  • Data Coordinator
    12/1/2022 - Present

    Led a team of Platform Data Engineers, structuring the squad's roadmap and negotiating the backlog in collaboration with squad leaders. Ensured the quality and timely delivery of deliverables, maintained adherence to projected costs, and aligned team actions with strategic objectives. Facilitated agile practices through leadership in ceremonies and daily task monitoring. Participated in the hiring process, boosted team engagement, and spearheaded retention efforts. Delivered consistent feedback via biweekly 1:1 meetings, aiding in the development and supervision of Individual Development Plans (IDPs) for team members.

    Participated in in-depth technical discussions with data engineers, data architects, data analysts, developers, technical leads, and business area representatives. Contributed to architecture discussions and technical solution definitions, providing technical support and taking on high-complexity technical demands. Worked with tools and frameworks such as Apache Hadoop, Spark, Kafka, and NoSQL databases; cloud-based data storage solutions such as AWS S3 and Redshift; and monitoring tools such as Kibana and Prometheus for data pipeline management. Promoted code quality and version control via collaboration technologies including Git and Jenkins CI/CD pipelines.

  • Data Specialist
    5/1/2022 - 11/1/2022

    Led the development of a new data platform with a core objective of executing data ingestion from diverse sources including databases, files, streaming data, NoSQL databases, and APIs. The platform leveraged GCP cloud with a comprehensive stack of tools and technologies to ensure high efficiency and scalability. Proficiently utilized Composer for orchestration and workflow management, and Dataflow for robust data processing and transformation. Python was the primary programming language, complemented by BigQuery for advanced data storage, querying, and analysis.

    Implemented Pub/Sub for seamless event-driven communication, while Go addressed specific programming needs. Ensured continuous integration and delivery through Cloud Build, and managed relational databases with Cloud SQL. Kubernetes was employed for container orchestration, facilitating efficient application deployment and scaling. Maintained rigorous version control with Git and harnessed SQL along with NoSQL for handling both relational and non-relational data. Employed Terraform for precise provisioning and management of infrastructure, achieving a cohesive and efficient data platform environment.

  • Senior Data Engineer
    11/1/2020 - 5/1/2022

    Crafted robust data pipelines using PySpark on AWS, transforming and processing data efficiently. Spearheaded the architectural design of S3 storage solutions, successfully implementing comprehensive data workflows in AWS Glue. Extracted and migrated data from DynamoDB into Redshift, optimizing ETL processes for high performance. Developed Python-based Lambdas and utilized Kinesis Firehose for real-time data ingestion. Created insightful PowerBI reports, leveraging data extracted from AWS to drive business decisions.

    Specialized in building and maintaining both tabular and multidimensional cubes using SSAS, enhancing data analysis capabilities. Developed complex ETL packages with SSIS and engineered impactful reports through SSRS. Provided critical support to the data environment across Azure and AWS platforms, ensuring reliability and scalability.

    Leveraged a diverse array of tools and technologies including AWS Glue for efficient ETL workflows and data cataloging; DynamoDB for managing NoSQL databases; RabbitMQ for robust messaging solutions; and S3 for secure and scalable object storage. Developed Lambdas for serverless architecture and utilized AWS CloudWatch for comprehensive monitoring and log management. Employed Jupyter Notebooks on AWS for advanced data exploration and analysis. Implemented Apache Kafka for distributed data streaming and utilized EMR for large-scale data processing. Deployed scalable EC2 instances and integrated REST API services for streamlined communication between applications. Harnessed Kinesis for real-time data streaming and employed Redshift for data warehousing needs.

    Utilized AWS Athena for sophisticated data queries over S3, and implemented Zabbix for extensive network and application monitoring. Navigated both Linux and Windows server environments skillfully. Employed the Microsoft data platform, including SSIS, SSAS, SSRS, and SSMS, for end-to-end data solutions. Managed databases efficiently using SQL Server and leveraged Postman for comprehensive API testing and documentation. Mastered Python for scripting and automation tasks, employed Spark for intensive big data processing, and used Docker for streamlined containerization and deployment. Created compelling data visualizations and reports with PowerBI, and executed complex database queries with SQL, ensuring data integrity and accessibility.

  • Senior Software Engineer
    11/1/2018 - 11/1/2020

    Spearheaded architecture design for systems on Azure, including new demands in Java and Ruby systems supporting the tax and IT sectors. Specialized in troubleshooting Java and Ruby systems and resolving Azure and database-related issues. Engineered and maintained ETL routines using Pentaho and SSIS, integrating diverse data sources such as SAP, Oracle, call systems, and budget systems to transform data for the supply and taxation areas. Built and maintained Java Selenium robots for data extraction from internal systems, feeding ETL routines that generated KPIs for supply and taxation, as well as Java Selenium robots for invoice bookkeeping on government websites and Python bots whose extracted data fed the same KPI-producing ETL routines.

    Technical expertise refined through extensive use of Azure, Pentaho, SSIS, and PowerBI, with Vaadin for web applications and SAP (Hana and ECC) for enterprise resource planning. Proficient in Java, Python, Ruby on Rails, and JavaScript, managing databases including Oracle, SQL Server, MySQL, and PostgreSQL. Adopted Kanban for project management and DBeaver for database management.

  • Systems Analyst
    6/2/2018 - 10/2/2018

    Responsible for gathering requirements and deploying a Business Intelligence (BI) system. Developed the architecture and designed ETL processes for various components including data sources, data warehouses, and data cubes. Conducted requirement gathering from departments involving patents, finance, and legal. Created dashboards featuring strategic company information. Performed reverse engineering of existing systems to document BI processes. Utilized market tools including Oracle Database 11, Pentaho, PL/SQL, Dashbox, Microsoft PowerBI, and AWS.

  • Telecom and Information Systems Analyst
    4/2/2017 - 6/2/2018

    Gathered requirements to implement a BI system, focusing on areas such as patents, finance, and legal. Set up architecture and ETL processes involving data sources, data warehouses, and cubes. Developed strategic dashboards to provide valuable company insights. Performed reverse engineering of the existing system to document BI processes. Utilized Oracle Database 11, Pentaho, PL/SQL, Dashbox, Microsoft PowerBI, and AWS to complete tasks and ensure system efficiency.

  • Systems Analyst
    4/2/2015 - 4/2/2017

    As a Systems Analyst with extensive experience, monitored integration project tests within HR departments for major organizations, ensuring alignment with corporate objectives. Performed unit testing and database administration, used Mantis Bug Tracker for issue tracking, and managed communication with software factory suppliers.

    As a BI Analyst, played a pivotal role in designing and executing BI projects for renowned international sporting events, excelling from requirements gathering to support and maintenance phases. Proficient in work estimation through function point analysis, consistently generated both technical and functional documentation to maintain project alignment. Showcased skills in creating informative dashboards and handling comprehensive ETL processes. Leveraged a variety of software and technologies including OBIEE, ODI, Oracle Database 11g, Oracle SQL Developer, SQL Developer Data Modeler, and Microsoft Project. Demonstrated adeptness in developing relational and multidimensional databases, and constructing OLAP cubes while managing data on Microsoft SQL Server, Linux, and Windows Server 2008 platforms.

    In the capacity of a Business Analyst, contributed to the success of .NET application development for large international organizations. Conducted thorough requirement gathering, data modeling with Oracle SQL Developer Data Modeler, and documented business processes using UML and prototyping techniques like Axure. Ensured quality and comprehension of project requirements among developers and clients, thereby enhancing project outcomes.

  • Business Intelligence Analyst
    1/2/2014 - 4/2/2015

    Gained expertise in customer data analysis, interpretation, and report building to support decision-making processes. Constructed dashboards utilizing business intelligence tools such as QlikView and Tableau. Developed, maintained, and documented ETL/OLAP processes and database structures. Managed and administered MySQL databases.

  • Data Analyst
    5/2/2011 - 10/2/2012

    Developed expertise in analyzing and interpreting graphs and data generated by the Pipeline Inspection Gauge (PIG), leveraging different technologies. Produced comprehensive reports in English, Spanish, and Portuguese detailing the physical condition of oil pipelines, gas pipelines, slurry pipelines, and other similar infrastructure for South American clients. Enhanced data processing and project preparation capabilities to initiate data evaluations effectively. Applied data adequacy procedures, searched for pertinent information regarding clients' pipelines, and utilized SQL to query relevant data. Maintained communication with project managers in South American countries and queried data in the company's proprietary ERP system for thorough project analysis.

  • Development Intern
    2/2/2008 - 8/2/2008

    Developed expertise in deploying new environments and maintaining CVS. Managed the deployment and upkeep of JBOSS servers on Linux platforms. Customized Java frameworks for the Java Development team. Worked as a Java Web Programmer, updating and creating development projects for clients in the banking sector, utilizing Java EE technologies, JavaScript, and PL/SQL within the Eclipse IDE.

  • Bachelor's Degree in Information Systems at Estácio de Sá University
    2005 - 2009

  • Project Management at Universidade Federal Fluminense
    2015 - 2018

  • Google Cloud Professional Data Engineer at Google Cloud
    12/1/2022

Fabio is available for hire

Meet Fabio R.

All Howdy candidates are vetted for skills and English proficiency.