6 years and counting … the BizzTreat journey to BizzFlow and beyond…

Jiří Tobolka
May 3, 2021


The journey

This month marks six years since we started BizzTreat. Since day one, we have been focused on professional services for our clients. Our transparency, independence, and fast time to delivery have been our key strengths. We are now a team of 30 great people. We love this journey.

We have always worked with the best tools on the market (or, to be more precise, those that suit us and our clients' needs best), like Keboola Connection and GoodData. Later on, we added Tableau and then Power BI to have more options in data visualization tooling.

More than two years ago, something changed: we started developing our own ETL tool. Yes, yet another ETL tool, one would say … but we always felt we needed more independence, not only in data visualization but across the whole data delivery process and tooling. We had been thinking about it since the beginning. Then, "suddenly", we were contacted by a company interested in working with us, but there was one condition…

“We cannot use any new (3rd party) tool; it takes too much time to get it approved. Anyway, we have AWS, and we can spin up everything you need to get it done.”

Long story short: we did it. This was the very first version of our BizzFlow, deployed to the customer's cloud environment. Our data pipeline concept was born. We all love LEGO!

During this journey, we realized what current cloud environments are about. They are like LEGO bricks: you can fit them into your existing company data culture or start building one from them.

BizzFlow is a lightweight concept that leverages existing cloud infrastructure features and modules, plus Airflow as an orchestration engine. It is open, so you can be creative and develop your own "extension" when needed. It is built on top of high-performance analytical databases like Snowflake or BigQuery. Plus, we have baked our own best practices into it. All of this is deployed seamlessly to the customer's cloud.
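
To make the orchestration idea a bit more concrete, here is a minimal sketch of what an Airflow-orchestrated flow in this spirit could look like. The task names, Docker image, dataset names, and SQL are purely illustrative assumptions for the example, not BizzFlow's actual code; the point is only the pattern of running a containerised extractor and then a transformation inside a cloud warehouse such as BigQuery.

```python
# Illustrative only: a minimal Airflow DAG that extracts with a containerised
# component and then transforms inside the analytical warehouse (BigQuery here).
# All names (image, datasets, SQL) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="example_extract_and_transform",
    start_date=datetime(2021, 5, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a dockerised extractor that lands raw data in the warehouse.
    extract = DockerOperator(
        task_id="extract_orders",
        image="my-registry/orders-extractor:latest",  # hypothetical image
        command="python extract.py",
    )

    # Run a SQL transformation directly in BigQuery.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.orders_daily AS "
                    "SELECT order_date, SUM(amount) AS revenue "
                    "FROM raw.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )

    extract >> transform
```

A custom "extension" then simply becomes another Docker image wired into the same flow.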

It uses modern infrastructure tech like Docker and Terraform. We have also started experimenting with the ELT approach, plugging in dbt or similar tools once there is a use case where it makes sense.
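
As an illustration of what "plugging in dbt" can mean in practice, the sketch below treats a dbt run as just another orchestrated step. The DAG id, project path, and target name are assumptions made up for the example, not a description of our actual setup.

```python
# Illustrative only: dbt plugged in as one more orchestrated step.
# The project path and target name are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_elt_with_dbt",
    start_date=datetime(2021, 5, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Once raw data has been loaded into the warehouse,
    # let dbt handle the in-warehouse transformations.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics_project && dbt run --target prod",
    )
```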

What’s on the horizon

We see that many companies (smaller ones included) are starting to think about data use cases beyond internal analytics. They want to provide data to their customers or even create data products. We have seen a lot of activity in this area over the last few months.

We like GoodData’s CN (cloud-native) offering and the whole DaaS (data-as-a-service) vision, which perfectly covers how we think about the data environment. So thanks for bringing it on and naming it!

Thanks to that, we can spin up a cloud environment for our client and deploy both the data pipeline and the data analytics/visualization platform in one place.

Having everything in your own cloud infrastructure brings a significant advantage: you can boost power when needed, and you have more freedom and flexibility. Yes, you need some DevOps people, but once you have them, you are ready to go. As I have mentioned, think about the LEGO bricks!

We are happy about this direction; it reflects our long-term view, and we can't wait to see where it takes us.

Cheers! JT.
