Work History
Developed a POC application for certifying Citi's internal IT solutions using Python FastAPI and Appian.
The app implements a multistage workflow that mirrors Citi's internal certification process. The first stage collects data from internal Citi sources, which is then validated against a proprietary Risk and Controls technology. The subsequent stages represent the internal approval chain, with different actors involved at each step.
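The multistage flow described above could be sketched as a simple state machine; the stage names below are purely illustrative, not the actual stages of Citi's process:

```python
from enum import Enum, auto

class Stage(Enum):
    """Illustrative certification stages (not the real stage names)."""
    DATA_COLLECTION = auto()
    RISK_CONTROLS_VALIDATION = auto()
    MANAGER_APPROVAL = auto()
    FINAL_SIGN_OFF = auto()
    CERTIFIED = auto()

# Linear approval chain: each stage advances to the next on approval.
NEXT_STAGE = {
    Stage.DATA_COLLECTION: Stage.RISK_CONTROLS_VALIDATION,
    Stage.RISK_CONTROLS_VALIDATION: Stage.MANAGER_APPROVAL,
    Stage.MANAGER_APPROVAL: Stage.FINAL_SIGN_OFF,
    Stage.FINAL_SIGN_OFF: Stage.CERTIFIED,
}

def advance(stage: Stage, approved: bool) -> Stage:
    """Move a certification request one step along the chain.

    A rejection sends the request back to data collection; an
    approval moves it to the next stage in the chain.
    """
    if not approved:
        return Stage.DATA_COLLECTION
    return NEXT_STAGE.get(stage, stage)
```

In a FastAPI app each approval endpoint would call something like `advance` and persist the new stage for the request.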
Led the development of automation tools for generating PowerPoint and Word reports, including an automated screen-capture scraper for image collection.
Worked on a project in which a booking system developed in a mix of Java, MuleSoft Enterprise Service Bus (ESB) and NodeJS was being ported to a serverless, event-driven Python AWS Lambda architecture.
I personally wrote key parts of the architecture for a file-processing system and an OOP integration with an Oracle database, and laid the foundations for integration testing.
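A minimal sketch of the event-driven file-processing pattern described above, assuming files land in S3 and trigger a Lambda; the bucket/key names and the `process_file` step are illustrative placeholders:

```python
import json
import urllib.parse

def process_file(bucket: str, key: str) -> dict:
    """Placeholder for the actual file-processing logic (illustrative)."""
    return {"bucket": bucket, "key": key, "status": "processed"}

def lambda_handler(event: dict, context=None) -> dict:
    """AWS Lambda entry point for an S3 'ObjectCreated' event.

    S3 event notifications carry one or more records, each naming the
    bucket and (URL-encoded) object key that triggered the invocation.
    """
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        results.append(process_file(bucket, key))
    return {"statusCode": 200, "body": json.dumps(results)}
```

Keeping `process_file` separate from the handler makes the business logic testable without invoking Lambda itself.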
I worked on a project with Zaizi and The National Archives in which I prototyped a Django application integrated with a Keycloak server for authentication and authorisation. I implemented the application code and custom Docker images for both services.
Code is publicly available at
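On the Django side, such an integration could look roughly like this, e.g. using the `mozilla-django-oidc` package (one common choice; the realm, client ID and hostnames below are placeholders, and the endpoint paths assume a recent Keycloak version):

```python
# settings.py (fragment) -- OpenID Connect against a Keycloak realm.
# All values below are placeholders, not the real project's configuration.
KEYCLOAK = "https://keycloak.example.com/realms/my-realm/protocol/openid-connect"

AUTHENTICATION_BACKENDS = [
    "mozilla_django_oidc.auth.OIDCAuthenticationBackend",
    "django.contrib.auth.backends.ModelBackend",
]

OIDC_RP_CLIENT_ID = "django-app"
OIDC_RP_CLIENT_SECRET = "change-me"
OIDC_RP_SIGN_ALGO = "RS256"
OIDC_OP_AUTHORIZATION_ENDPOINT = f"{KEYCLOAK}/auth"
OIDC_OP_TOKEN_ENDPOINT = f"{KEYCLOAK}/token"
OIDC_OP_USER_ENDPOINT = f"{KEYCLOAK}/userinfo"
OIDC_OP_JWKS_ENDPOINT = f"{KEYCLOAK}/certs"
```

Keycloak exposes these OIDC endpoints per realm, so the Django container only needs the realm URL and client credentials.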
I was part of the ISC (International Supply Chain) team, where I helped build internal tools for order
and transport management. The architecture was heavily based on GraphQL APIs and Kafka
workers.
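A Kafka worker in that setup could be sketched as below; the topic name, message shape and `handle_order_event` logic are illustrative, and the consumer loop assumes the `kafka-python` package:

```python
import json

def handle_order_event(message: bytes) -> dict:
    """Pure message handler: decode an order event and derive its routing.

    Keeping the handler free of Kafka plumbing makes it easy to
    unit-test. (The event shape here is illustrative.)
    """
    event = json.loads(message)
    return {
        "order_id": event["order_id"],
        "action": "dispatch" if event.get("status") == "ready" else "hold",
    }

if __name__ == "__main__":
    # Consumer loop, assuming the kafka-python package and a local broker.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer("orders", bootstrap_servers="localhost:9092")
    for record in consumer:
        print(handle_order_event(record.value))
```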
Part of the Database-Storage-Infra-Engineering team, I worked on platform automation solutions to
improve the reliability of JET's infrastructure, using a wide range of technologies common in
modern tech stacks. The last project I worked on before leaving was an automated pipeline built
in Ansible, used to deploy an AWS RDS database with the Auditing option enabled. This was crucial to
the business as it was a requirement for SOX compliance.
I joined Sainsbury's during the transition from a monolithic to a microservices architecture. I had
the opportunity to learn cutting-edge technologies such as Kubernetes and Kafka, and I gained a
lot of exposure to DevOps culture. I was part of the data product team, in charge of providing
data services to internal and external stakeholders. I worked on and successfully delivered an
ETL system that was critical to the business.
Mavens of London (now part of the Kantar group) was a young and vibrant digital marketing agency
where I met many talented people and learned a lot about Python and coding in general. We built
web scrapers in Python, sometimes also in NodeJS, and I was involved in many ML and data-driven
projects. I had the chance to learn Pandas and many libraries from the data-science ecosystem, and I worked with several frameworks such as Airflow, Plotly Dash, Flask and, to a lesser extent, Django.
The last project I worked on before leaving the company was a concurrent
web scraping system with a REST API that data analysts could use to gather data at large scale.
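The concurrent core of such a system can be sketched with a thread pool; the fetch function is injected so the concurrency logic can be tested without network access (names and shapes here are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, Dict, List

def scrape_all(urls: List[str],
               fetch: Callable[[str], str],
               max_workers: int = 8) -> Dict[str, str]:
    """Fetch many pages concurrently, returning a url -> body mapping.

    `fetch` is injected (e.g. a requests.get wrapper in production,
    a stub in tests), so the concurrency logic stays network-free
    and testable. Error handling and retries are omitted for brevity.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        bodies = pool.map(fetch, urls)  # preserves input order
        return dict(zip(urls, bodies))
```

In production, a REST endpoint would accept a list of URLs, call something like `scrape_all`, and return the collected results.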
My first job in the IT industry. I was initially hired as an Operations Engineer, in charge of
running a system designed to monitor mobile advertisements. It consisted of a NodeJS scraper, a
Python scheduler and some Python scripts for data processing. Here I had the opportunity to gain
hands-on experience with Python 2.7 and become familiar with Linux, the bash terminal and Git. One
year into the role I progressed into a Junior Developer position, where I finally built my first
software solution: a web scraper built with Python 2.7, Selenium and PhantomJS. It helped the
company automate a task that had previously been done manually and was very time-consuming.