To Smart Source, “data science” is the capability of algorithmically deriving insight from data. This is not the same as artificial intelligence (AI): the data, and the insights derived from it, are not themselves “intelligent”. What Smart Source data scientists do is obtain data-driven insights in a smarter, more systematic way.
Our approach to data science projects is three-fold:
- First and foremost is getting the data in order. This involves not only understanding the data the customer has and needs, but also the processes surrounding that data.
- Second, deploying the appropriate tools and technological capabilities. These include machine learning, natural language processing and network analysis algorithms, among others, used to uncover what the data says, or could say, about patterns and behaviour.
- Third, putting in place the infrastructure needed to support the two components above. This entails decisions on using:
- Unified platforms for data collection and submission;
- Cloud technology to run data analysis;
- Reusable tools and code libraries to facilitate analysis; and
- APIs for disseminating data and allowing scalability.
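The three components above can be illustrated with a minimal, purely illustrative Python sketch. All function and field names here are hypothetical, and the “analysis” step uses a simple standard-deviation check as a stand-in; in a real engagement this step would be a machine learning, NLP or network analysis algorithm:

```python
import json
import statistics

def clean(records):
    """Step 1: get the data in order -- drop incomplete rows and
    normalise types (hypothetical record shape)."""
    return [
        {"id": r["id"], "value": float(r["value"])}
        for r in records
        if r.get("id") is not None and r.get("value") is not None
    ]

def find_outliers(records, threshold=1.5):
    """Step 2: a stand-in for an analysis algorithm -- flag values
    more than `threshold` standard deviations from the mean."""
    values = [r["value"] for r in records]
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [r for r in records if abs(r["value"] - mean) / stdev > threshold]

def to_api_payload(records):
    """Step 3: infrastructure -- a JSON payload as it might be
    served from a data-dissemination API."""
    return json.dumps({"count": len(records), "results": records})

raw = [
    {"id": 1, "value": "10"},
    {"id": 2, "value": "11"},
    {"id": None, "value": "99"},  # incomplete row, dropped in cleaning
    {"id": 3, "value": "12"},
    {"id": 4, "value": "100"},
]
cleaned = clean(raw)           # 4 usable records
outliers = find_outliers(cleaned)
payload = to_api_payload(outliers)
```

Each step is a small, reusable function, reflecting the reusable-tools and API points in the list above.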
We not only apply the right algorithms to the right datasets on the right infrastructure; we also understand what those algorithms are doing and what their results mean for decision-making. This goes beyond the science of cleaning data and understanding the mathematics behind the algorithms; it is the art of contextualising their output.
Smart Source is proud to have the right mix of data scientists, data engineers and data architects to take on projects of any complexity, whether starting from scratch or taking over an existing AI/data science initiative. Our engineers have expertise in R, Python and various other AI programming tools, both open source and commercially available.