No matter the size of your data, if you need to import it, your system must be robust.
At ElixirConf EU, developer Laszlo Bacsi told the story of how he built a data pipeline to process billions of rows, and his team learned quite a few lessons along the way!
It doesn't matter whether you have "big data" or "small data": if you need to import and process it in near real time, you want a system that is robust and maintainable. This is where the failure tolerance and scalability of Erlang/OTP, the expressiveness of Elixir, and the flexibility of Flow and GenStage are all great assets.
This is a story of how we built a data pipeline to move and process billions of rows from MySQL and CSV files into Redshift and what we learned along the way.
This talk was given by Laszlo Bacsi at ElixirConf EU.