Accelerating the world’s transition towards data-centric decision making
Purpose and Vision
Efficient and fair exchange of data is the foundation of the digital economy, but it currently faces an overwhelming number of systemic barriers. Kamu was created by a passionate team of engineers and scientists to address the unique challenges of cross-organizational data exchange using greenfield technology.
Our vision is a world where:
- Data plays a central role in the decision making of businesses, governments, and individuals.
- High-quality data is easily accessible to everyone, enabling fully informed decisions.
- Within seconds, new information contributes to our communal understanding of difficult problems.
- Doubts about the trustworthiness and reliability of data can be resolved in minutes.
Team
Sergii Mikhtoniuk
Previously a Sr. Architect & Tech Lead at Activision-Blizzard, Sergii has put together the technological foundation of Kamu by combining his deep expertise in distributed, highly-scalable, near-real-time systems and enterprise data pipelines with his passion for Open Data and data-driven decision making.
Özge Nilay Yalcın, PhD
Holding a PhD in Cognitive Science and Artificial Intelligence, Nilay brings extensive experience in leading diverse research groups, working with massive government and healthcare datasets, and creating state-of-the-art AI architectures that benefit people and society.
Robert Delamar
Previously the CEO of BitTorrent, a lawyer, and a serial entrepreneur, Robert brings many years of leadership and business experience in deep-tech startups and a passion for decentralized data management systems.
Sergei Zaychenko, PhD
With a PhD in Computer Engineering, 20 years of experience leading engineering teams and architecting high-performance, cutting-edge software solutions in the Electronic Design Automation domain for worldwide giants in aerospace, automotive, and defense, and over 10 years of teaching software engineers advanced design and development practices, Sergei leads Kamu's development and research.
Dima Pristupa
Dima is an experienced engineer, programming language polyglot, and former startup founder. He has developed high-load systems in the e-government, low-latency video streaming, and blockchain spaces. He is passionate about enriching the open-source community and creating simple, well-crafted solutions.
Olena Vodzianova
Olena holds a Master's degree in Economic Cybernetics and has a decade of experience in diverse domains, including self-driving cars, connected vehicle data, and search intelligence. Proficient in multiple programming languages such as Java, Scala, Clojure, and Python, she is a dedicated data engineer with a passion for continuous learning.
Forrest DiPaola
Forrest holds a Master of Science in Geographic Information Systems (GIS), bringing expertise in data science, artificial intelligence, mapping, geospatial analysis, and geographic principles. Proficient in Python, R, and SQL, Forrest is passionate about analyzing and enriching data. With experience in geospatial data systems, he applies machine learning and data-driven decision-making to deliver insightful solutions.
Steve DiPaola, Prof.
Advisor
Steve is a renowned expert in AI/ML and computational modeling and a member of the Royal Society of Canada (Canada's highest academic honour). He has taught at SFU, Stanford, and NYIT, has over 100 computer science publications, patents, and books, has held senior positions at Electronic Arts and Saatchi Innovation, and has consulted for HP, Microsoft, Adobe, and the Institute for the Future.
Timeline
2019
Years of experimentation with personal finance data, web scraping, open government and research data analytics, training AI/ML models on public datasets, and using data in smart contracts built up an ever-increasing frustration with poor discoverability, poor quality, and debilitating data sharing practices. This led to a "greenfield" exercise: imagining how modern enterprise data technologies could be adapted to solve the unique challenges that arise when data needs to move across organizational boundaries.
Kamu's first distinctive features take shape from a mix of stream processing, event sourcing, bitemporal data modeling, version control, and blockchain technologies.
2020
The first functional prototype of kamu-cli is developed using Scala and Apache Spark.
Open Data Fabric is extracted as an open, extensible, community-governed protocol.
kamu-cli is rewritten in Rust, moving it from the prototype stage to steady long-term development.
The Open Data Fabric protocol is announced to the community, and a whitepaper is published.
2021
Kamu forms an "Open Data for Data Science" consortium with Dell, Intel, and the University of Groningen to improve the state of governmental and research data management and exchange in Europe.
Founders start working full-time!
Over 60 interviews with researchers, data scientists, developers of leading data platforms, and engineers from all sectors of the economy take place to determine market fit.
The ODF community starts growing following several conference talks and posts on social media.
2022
Development of ODF's decentralized data processing components and web platform begins.
Kamu receives its first funding from the Revere fund and expands the team.
Kamu graduates from Faber-Filecoin Web3 accelerator and receives funding from Faber and Protocol Labs.
Kamu is invited to join Protocol Labs “Compute Over Data” working group dedicated to decentralized processing of Web3 data.
Kamu is accepted to Creative Destruction Lab program.