About Me

Benjamin Dornel

Data professional with a background in tech communications consulting.
Passionate about combining qualitative and quantitative data to deliver powerful insights.

After graduating from the National University of Singapore in 2019 with a degree in Global Studies, I entered the field of technology public relations where I combined my training in global political and socio-economic issues with my interest in technology.

As a data-driven storyteller, I contributed heavily to internal business development through media sentiment analysis and share of voice research. I also drove media placements for technology companies across various platforms.
With my interest in data analytics growing, I decided to develop my technical skills by picking up Python and SQL. After completing several coding projects, I made the decision to transition towards a career in data science.

To accelerate my development, I joined General Assembly's full-time Data Science Immersive. This intensive, four-month-long program honed my abilities in various areas including pandas, NumPy, machine learning, statistics, and predictive modelling toolkits like scikit-learn.

I'm interested in all aspects of data science, but there are a couple of areas I'd like to explore further:

  • Media Analytics: With social media platforms rising to become primary providers of news and entertainment content, I believe it's imperative that businesses use data analysis and modelling to cut through the noise and connect with their target audience - be it with ads or stories.

    I also believe that ordinary people should have access to automated tools that can separate fake news from real news. With the rapid development of deep learning and natural language processing, it has become increasingly possible to detect harmful misinformation at an early stage.

  • Data Engineering: Programming is one of the things within data science that I find myself most drawn to. Before starting the data modelling process, it's vital to have a data pipeline that is consistent, reliable, and fast. I constantly think about the best way to achieve this while keeping my code clean and interpretable.