The Dimensions Of An Effective Data Science Team



The Need for Data Science

Organisations worldwide are increasingly looking to data science teams to provide business insight, understand customer behaviour and drive new product development. The broad field of Artificial Intelligence (AI), including Machine Learning (ML) and Deep Learning (DL), is exploding both in terms of academic research and business implementation. Some of the world’s biggest companies, including Google, Facebook, Uber, Airbnb and Goldman Sachs, derive much of their value from data science effectiveness. These companies use data in highly creative ways, generating enormous competitive advantage and business insight in the process.

Have you ever wondered how Google Maps predicts traffic? How does Facebook know your preferences so accurately? Why would Google give away a platform as powerful as Gmail for free? Having data and a great idea is a start – but the likes of Facebook and Google have figured out that a key step in the creation of amazing data products (and the resultant generation of business value) is the formation of highly effective, aligned and organisationally supported data science teams.

Effective Data Science Teams

How exactly have these leading data companies of the world established effective data science teams? What skills are required and what technologies have they employed? What processes do they have in place to enable effective data science? What cultures, behaviours and habits have been embraced by their staff and how have they set up their data science teams for success? The focus of this blog is to better understand at a high level what makes up an effective data science team and to discuss some practical steps to consider. This blog also poses several open-ended questions worth thinking about. Later blogs in this series will go into more detail in each of the dimensions discussed below.

Drew Harry, Director of Data Science at Twitch, wrote an excellent article titled “Highly Effective Data Science Teams”. He states that “Great data science work is built on a hierarchy of basic needs: powerful data infrastructure that is well maintained, protection from ad-hoc distractions, high-quality data, strong team research processes, and access to open-minded decision-makers with high leverage problems to solve” [1].

In my opinion, this definition accurately describes the dimensions necessary for data science teams to be effective. As such, I would like to decompose this quote further and understand it in more detail.

Drew Harry’s Hierarchy of Basic Data Science Needs

Great data science requires powerful data infrastructure

A common pitfall of data science teams is that they are sometimes forced, either through lack of resources or through lack of understanding of the role of data scientists, to do time-intensive data wrangling activities (sourcing, cleaning and preparing data). Additionally, data scientists are often asked to complete ad-hoc requests and build business intelligence reports. These tasks should ideally be removed from the responsibilities of a data science team so that it can focus on its core capability: using mathematical and statistical abilities to solve challenging business problems and find interesting patterns in data, rather than expending effort on housekeeping work. To do this, data scientists should ideally be supported by a dedicated team of data engineers. Data engineers typically build robust data infrastructure and architecture, and implement tools to assist with data acquisition, data modelling and ETL.


Facebook, a world leader in data engineering, is a good example. Just imagine for a second the technical challenges inherent in providing over one billion people with a personalised homepage full of posts, photos and videos on a near real-time basis. To do this, Facebook runs one of the world’s largest data warehouses, storing over 300 petabytes of data [2], and employs a range of powerful and sophisticated data processing techniques and tools [3]. This data engineering capability enables thousands of Facebook employees to use their data effectively and to focus on value-enhancing activities for the company without worrying about the nuts and bolts of how the data got there.

I realise that we are not all blessed with the resources and data talent inherent in Silicon Valley firms such as Facebook. Our data landscapes are often siloed, and our IT support teams, where data engineers traditionally reside, mainly focus on keeping the lights on and putting out fires. But this model has to change – set up your data science teams to have the best chance of success. Co-opt a data engineer onto the data science team. If this is not possible due to resource constraints, then at least provide your data scientists with the tools to easily create ETL code and rapidly spin up bespoke data warehouses, enabling rapid experimentation. Whatever you do, don’t let them be bogged down in operational data sludge.
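As a purely illustrative sketch (the table and column names below are invented, and sqlite3 stands in only so the example is self-contained and runnable – it is not a recommendation of any particular warehouse technology), here is roughly what a small, scripted ETL step of that kind might look like in Python:

```python
# Hypothetical ETL helper: extract raw events, clean them, and load them into
# a bespoke "experiment" table that a data scientist can query directly.
import sqlite3


def run_etl(conn: sqlite3.Connection) -> None:
    cur = conn.cursor()
    # Extract: pull raw, possibly messy rows from a (hypothetical) source table.
    rows = cur.execute(
        "SELECT customer_id, amount, currency FROM raw_transactions"
    ).fetchall()

    # Transform: drop incomplete rows and normalise the currency code.
    cleaned = [
        (cid, float(amount), currency.strip().upper())
        for cid, amount, currency in rows
        if cid is not None and amount is not None and currency
    ]

    # Load: land the cleaned rows in a dedicated experiment table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS experiment_transactions "
        "(customer_id INTEGER, amount REAL, currency TEXT)"
    )
    cur.executemany("INSERT INTO experiment_transactions VALUES (?, ?, ?)", cleaned)
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_transactions (customer_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany(
        "INSERT INTO raw_transactions VALUES (?, ?, ?)",
        [(1, 100.0, " zar"), (2, None, "usd"), (3, 55.5, "USD ")],
    )
    run_etl(conn)
    print(conn.execute("SELECT * FROM experiment_transactions").fetchall())
```

The point is not the specific code but the pattern: extraction, cleaning and loading are scripted and repeatable, and owned by data engineering rather than eating into the data scientist’s day.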

Great data science requires easily accessible, high-quality data


Data should be trusted and of high quality. Additionally, there should be enough data available to allow data scientists to execute experiments. Data should be easily accessible, and the team should have processing power capable of running complex code in reasonable time frames. Data scientists should, within legal boundaries, have easy, autonomous access to data. Data science teams should not be precluded from using data on production systems; mechanisms need to be put in place to allow for this, rather than access being refused with a blanket “hey – this is production – don’t you dare touch!”

In order to support its army of business users and data scientists, eBay, one of the world’s largest auction and shopping sites, has successfully implemented a data analytics sandbox environment separate from the company’s production systems. eBay allows employees who want to analyse and explore data to create large virtual data marts inside its data warehouse. These sandboxes are walled-off areas that offer a safe environment for data scientists to experiment with internal data from the organisation, as well as the ability to ingest other types of external data sources.

I would encourage you to explore the creation of such environments in your own organisations in order to provide your data science teams with easily accessible, high-quality data that does not threaten production systems. It must be noted that to support this kind of environment, your data architecture must allow for the integration of all of the organisation’s (and other external) data – both structured and unstructured. As an example, eBay has an integrated data architecture that comprises an enterprise data warehouse that stores transactional data, a separate Teradata deep-storage database that stores semi-structured data, and a Hadoop implementation for unstructured data [4]. Other organisations are creating “data lakes” that allow raw, structured and unstructured data to be stored in vast, low-cost data stores. The point is that the creation of such integrated data environments goes hand in hand with providing your data science team with analytics sandbox environments. As an aside, all the effort going into your data management and data compliance projects will also greatly assist in this regard.
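To make the sandbox idea a little more concrete, here is a minimal sketch of the pattern: copy a bounded sample of production data into an isolated store that analysts can experiment on without touching the source. Everything here is hypothetical – the table names are invented and sqlite3 is just a stand-in for Teradata, Hadoop or whichever platform your organisation actually runs:

```python
# Hypothetical sandbox builder: read a bounded sample from "production" and
# write it into a separate store, so experiments never touch the source system.
import sqlite3


def build_sandbox(prod: sqlite3.Connection, sandbox: sqlite3.Connection, sample_rows: int = 1000) -> None:
    rows = prod.execute(
        "SELECT order_id, customer_id, amount FROM orders LIMIT ?", (sample_rows,)
    ).fetchall()
    sandbox.execute(
        "CREATE TABLE IF NOT EXISTS orders_sample (order_id INTEGER, customer_id INTEGER, amount REAL)"
    )
    sandbox.executemany("INSERT INTO orders_sample VALUES (?, ?, ?)", rows)
    sandbox.commit()


if __name__ == "__main__":
    prod = sqlite3.connect(":memory:")      # stand-in for the production warehouse
    sandbox = sqlite3.connect(":memory:")   # stand-in for the analyst's sandbox
    prod.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
    prod.executemany("INSERT INTO orders VALUES (?, ?, ?)", [(1, 10, 250.0), (2, 11, 99.9)])
    build_sandbox(prod, sandbox)
    print(sandbox.execute("SELECT COUNT(*) FROM orders_sample").fetchone())
```

The key design choice is that the sandbox only ever reads from production and writes to its own store, so experiments can never corrupt the systems of record.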

Great data science requires access to open-minded decision-makers with high leverage problems to solve


DJ Patil stated that “A data-driven organisation acquires, processes, and leverages data in a timely fashion to create efficiencies, iterate on and develop new products, and navigate the competitive landscape” [5]. This culture of being data-driven needs to be driven from the top down. As an example, Airbnb promotes a data-driven culture and uses data as a vital input in its decision-making process [6]. Airbnb uses analytics in its everyday operations, conducts experiments to test various hypotheses, and builds statistical models to generate business insights – to great success.

Data science initiatives should always be supported by top-level organisational decision-makers. These leaders must be able to articulate the value that data science has brought to their business [1]. Wherever possible, co-create analytics solutions with your key business stakeholders. Make them your product owners and provide feedback on insights to them on a regular basis. This helps keep the business context front of mind and allows them to experience the power and value of data science directly. Organisational decision-makers will also have the deepest understanding of company strategy and performance and can thus direct data science efforts to the problems with the highest business impact.

Great data science requires strong team research processes

Data science teams should have strong operational research capabilities and robust internal processes. This will enable the team to execute controlled experiments with high levels of confidence in their results. Effective internal processes help promote a culture of failing fast, learning quickly and feeding valuable results back into the business experiment/data science loop. Google and Facebook have mastered this in their ability to, amongst other things, aggregate vast quantities of anonymised data, conduct rapid experiments and share these insights internally with their partners, generating substantial revenues in the process.

Think of this as applying robust software engineering principles to your data science practice. Ensure that your documentation is up to date and of a high standard. Ensure that there is a process for code review, and that you are able to correctly interpret the results that you are seeing in the data. Test the impact of this analysis with your key stakeholders. As Drew Harry states, “controlled experimentation is the most critical tool in data science’s arsenal and a team that doesn’t make regular use of it is doing something wrong” [1].
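To illustrate what “regular use of controlled experimentation” can look like in practice, here is a small, self-contained sketch of a two-proportion z-test comparing conversion rates in a control and a treatment group. The figures are made up purely for illustration, and a real research process would add power calculations, guardrail metrics and peer review of the analysis:

```python
# Minimal controlled-experiment check: two-proportion z-test on conversion rates.
# The figures below are invented purely for illustration.
from math import sqrt, erf


def two_proportion_z(conversions_a, total_a, conversions_b, total_b):
    p_a, p_b = conversions_a / total_a, conversions_b / total_b
    p_pool = (conversions_a + conversions_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value


if __name__ == "__main__":
    z, p = two_proportion_z(conversions_a=120, total_a=2400, conversions_b=156, total_b=2380)
    print(f"z = {z:.2f}, p-value = {p:.4f}")  # a small p-value suggests the treatment differs from control
```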

In Closing

This blog is based on a decomposition of Drew Harry’s definition of what enables great data science teams. It provides a few examples of companies doing this well and some practical steps and open-ended questions to consider.

To summarise: a well-balanced and effective data science team requires a data engineering team to support it from a data infrastructure and architecture perspective. It requires large amounts of accurate, trusted data. It requires data to be easily accessible and needs some level of autonomy in accessing that data. Top-level decision-makers need to buy into the value of data science and have an open mind when analysing the results of data science experiments. These leaders also need to promote a data-driven culture and provide the data science team with challenging and valuable business problems. Finally, data science teams need to keep their own house clean and have adequate internal processes to execute accurate and effective experiments, which will allow them to fail and learn quickly and ultimately become trusted business advisors.

Some Final Questions Worth Considering and Next Steps

In writing this, some intriguing questions come to mind. Surely there is an African context to consider here? What are we doing well on the African continent, and how can we start becoming exporters of effective data science practices and talent? Other questions include: To what extent does all of the above need to be in place at once? What is the right mix of data scientists, data engineers and analysts? What is the optimal mix of permanent, contractor and crowd-sourced resources (e.g. Kaggle-like initiatives [7])? Academia, consultancies and research houses are beating the drum of how important it is to be data-driven, but to what extent is this always necessary? Are there some problems that shouldn’t be using data as an input? Should we be purchasing external data to augment the internal data that we have, and if so, what data should we be purchasing? One of our competitors recently launched an advertising campaign explicitly stating that their customers are “more than just data” – does this imply that some sort of “data fatigue” is setting in for our clients?

My next blog will explore in more detail the ideal skill sets required in a data engineering team and how data engineering can be practically implemented in an organisation’s data science strategy. I will also attempt to tackle some of the pertinent open-ended questions mentioned above.

The dimensions discussed in this blog are by no means exhaustive, and there are certainly more questions than answers at this stage. I would love to see your comments on how you may have seen data science being implemented effectively in your organisations or some vexing questions that you would like to discuss.

References

[1] https://medium.com/mit-media-lab/highly-effective-data-science-teams-e90bb13bb709

[2] https://blog.keen.io/architecture-of-giants-data-stacks-at-facebook-netflix-airbnb-and-pinterest-9b7cd881af54

[3] https://www.wired.com/2013/02/facebook-data-team/

[4] http://searchbusinessanalytics.techtarget.com/feature/Data-sandboxes-help-analysts-dig-deep-into-corporate-info

[5] https://books.google.co.za/books?id=wZHe0t4ZgWoC&printsec=frontcover#v=onepage&q&f=false

[6] https://medium.com/airbnb-engineering/data-infrastructure-at-airbnb-8adfb34f169c?s=keen-io

[7] https://www.kaggle.com/

by Nicholas Simigiannis

The Power of the Unconversation

On the 9th of March 2017 twelve enthusiastic Foundery members attended DevConf 2017, South Africa’s biggest community driven software development conference: an event that promised learning, inspiration and networking.

Courtesy of DevConf 2017 (devconf.co.za)


With a multi-tracked event such as this one, there is usually something for everyone, and yet if you speak to serial conference attendees (guilty as charged), the talks aren’t the greatest reason to attend.

People like me go to conferences in part for the scheduled content, but mostly for the unscheduled conversations in the passage en route to a talk or around a cocktail table during a break. The “unconversations”, I’m calling them. It’s the conference equivalent of another well-known creative outlet: “water cooler conversations”.

I’ll admit that I’m a bit of a conference butterfly – actively seeking out these “unconversations” so that I can join them. I especially take note as crowds disappear into conference rooms. I’m drawn to the groups of people who stay behind wherever they might have gathered. That’s where I’m almost guaranteed to participate in really interesting discussions and learn something new. When I attend conferences, it’s this organic and informal style of collaborative enquiry I look forward to the most.

Courtesy of DevConf 2017 (devconf.co.za)

Ironically it was one of the DevConf talks that helped me understand why these “unconversations” tend to work so well as creative spaces. In his talk on Mob Programming, Mark Pearl mentioned a study conducted by the American Psychological Association which established that groups of 3-5 people perform better on complex problem solving than the smartest person in the group could perform on their own. See “references” for more information.

Loosely translated, a group of people has a better shot of solving a complex problem together than if they tried to solve it independently.

As a Mob Programming enthusiast myself, this makes complete sense to me. What’s interesting is that this research is not new, yet many organisations still discourage “expensive” group-work and continue to reward individual performance, and I can see why. For people with similar upbringings and educational backgrounds to mine, this is the comfort zone. We default to working alone and feel a sense of accomplishment when we achieve success individually. As children we were told to solve problems and find answers on our own. Receiving help was a sign of weakness, and copying was forbidden.

In contrast, the disruptive organisations of the last few decades encourage the complete opposite. These organisations recognise the value of problem-solving with groups of people who have varying, and even conflicting, perspectives. There’s no time for old-school mindsets that favour individual efforts over collaboration. We need to cheat where it’s appropriate by knowing who can help us and what existing ideas we can leverage.

I don’t mean to trivialise it. There’s a bit more involved than just creating opportunities for people to solve problems in groups. According to the book “Collective Genius”, innovative companies such as Google have developed three important organisational capabilities: creative abrasion (idea generation by encouraging conflict and high quality feedback), creative agility (hypothesizing, experimenting, learning and adapting) and creative resolution (deciding on a solution after taking new knowledge into account) all supported by a unique style of leadership. The case studies are incredibly motivating.

Since joining the Foundery I’m discovering that we are practising these things every day, and the amazing ideas and products born from our “collective genius” serve as confirmation that we’re on the right track. Is it always easy? No, absolutely not. It requires a great deal of mindfulness.

When I’m reflective I notice that the greatest ideas and most creative solutions I’ve brought to life were conceived with input from others. Many of the dots I connected for the first time happened during completely unlikely meetings of minds, and some through passionate differences of opinion. In an environment that calls for constant collaboration, it’s wonderfully refreshing to find that the “unconversations” I enjoy so much are happening all around me, every day.

And so long as I’m participating, I am always reminded that together we are more capable of solving really complex problems than the smartest one among us, and I’m becoming more and more OK with that.

References:

By Candice Mesk

 

The Doosra

Working in an investment bank over the past decade has provided the opportunity for many interesting conversations about the value an investment bank represents to society. Often the model of a “zero-sum game” is proposed, which suggests that finance doesn’t add much: in the transactions that banks facilitate, someone is a winner and someone else is the loser, so there is no net gain to the world. Other purists would argue something along the lines of efficient allocation of resources. That initially sounded a bit too creative for my more linear reasoning, but after years in the trenches, it has developed an intuitive ring of truth to it.


Similarly, digital disruption suffers from questionable motives. For some enterprises, such as Uber, it may appear that the shiny plaything of some young geeks on the west coast of America has been allowed to plough through the livelihoods of real people with real jobs and families around the world. When applying such thinking to digital disruption in the realm of investment banking, the question arises as to whether there is any real value that this rather obscure digital offspring of an already often-questioned enterprise can produce.

At times this line of thinking has led me to check my own passion for this “new vector of commerce”. How do I ensure that my natural fascination with some “new and shiny” geek toy does not divert what should be a cold, objective application of technology to investment banking, rather than becoming an excuse to pursue disruption for its own sake? How do we ensure a golden thread of validity and meaning in this exercise?

I started thinking about Google, and how I could justify what value it might have brought to the world (and not just its shareholders). I won’t pretend that I spent much time on this question, but I did come to the following example. Google Maps is a fantastic application, and I probably initially loved it more for the fact that here we have an application that brings the real world (travel, maps, my phone, my car) together with the digital world (the internet, GPS technology, cloud-based algorithms).

However, it is a tool that many people use, and its value extends beyond that initial fascination. In a very real way, there are likely to be hundreds of millions of people who use Google Maps every day to guide them along an optimal route in their cars. And, true to form, it manages to do this: either by advising detours around potential traffic jams, or by simply showing quicker routes that save time.

That extra time in traffic that has been avoided represents a very real saving in carbon emissions, and real energy that would otherwise have been wasted pumping cylinders up and down in an idling vehicle. This is not a zero-sum equation in which Google benefits and many small companies lose out. This is a very real benefit to the world, where increased efficiency reduces the amount of wasted energy and wasted human time. This is a net-positive game for the world. In some respect the world of humans wins, and the domain of entropy loses – if we are forced to put a name to it.

Personally I would feel deeply gratified if I could produce such a result that created a new benefit to either the world, or at the very least some small piece of it.

Interestingly enough, this speaks to an underlying theme which appeals to many people who are attracted to incubators of disruption, such as the Foundery. Many people really do feel that they would like to be part of something that changes the world. Perhaps this is because such incubators invoke the perceived “spirit” of Google, Facebook and other Silicon Valley heroes as an inspirational rallying cry. I believe that the example of Google Maps shows that the present opportunity of disruptive technology can give rise to very real efficiencies and benefits. Perhaps those seemingly naive passions that are stirred in the incubatees are valid, and should be released to find their form in the world.

So how do we harness this latent energy? Where do we direct it for the best chance of success?

Some of the technologies to be harnessed, and which represent the opportunity of disruptive technology:

  1. IoT (the internet of things):

At its most simple, this means that various electronic components have become sufficiently small, powerful and, most importantly, cheap. It has become possible and economically viable to monitor the temperature, humidity and soil hydration of every single plant in a farmer’s field, or to measure the status of every machine on a production line in a small factory on the East Rand, without bankrupting the owner with implementation costs.

Apart from sensors, there are actuators in the world such as smart locks, smart lights and the smart home which enable real-world actions to be driven and controlled from the internet. Together these provide the mechanism for the real world to be accessible to the digital world.

This extends beyond the “real“ real world: there are changes at play, not too far under the surface of the modern financial system, that are turning the real world of financial “things” (shares, bonds, financial contracts) into the internet world of financial “things” (dematerialised and digitised shares, bonds online, financial contracts online).

There are also actuators in this world, such as electronic trading venues and platforms which enable manipulation of digital financial contracts by digital actors of finance.

  2. Data is free:

The cost per megabyte of storage continues to drop exponentially, and online providers are able to offer services on a rental basis that would have been inconceivable a decade ago. The ubiquity of cheap and fast bandwidth enables this even further.

  3. Computation is cheaper than ever, and simple to locate with cloud-based infrastructure:

Moore’s law continues unabated, providing computational power that drops in cost by the day – to say nothing of the promise of quantum computing, which seems to be just around the corner.

  4. The technologies to utilize are powerful, free and easy to learn:

If you have not yet done so, have a sojourn on the internet across such topics as Python, TensorFlow, Quandl, Airflow and GitHub. These represent (largely) free, open-source capabilities to harness the technologies above and make them your plaything. Not only that, the amount of free resources “out there” that can help you master each of these is astounding.

A brief exercise in trying to automate my house using Python revealed hundreds of YouTube videos of similarly obsessed crazies presenting fantastic applications of Python to automating everything from their garage doors and fish tanks to pool chlorine management systems and alarms. These YouTube videos are short, to the point, educational, free and, most importantly, crowd-moderated – all the other Python home-automation geeks have ensured that the very good videos are upvoted and easily found, and the least fit are doomed to obscurity.

This represents another, perhaps unforeseen, benefit of the internet: crowd-sourced, crowd-moderated, efficient and specific education. JIT learning (“just-in-time learning”) means being able to learn everything that you need to accomplish a task five minutes before you need to solve it, and perhaps to forget it all almost immediately once you have solved it… (That is an interesting paradigm to counter traditional education.)

( P.S. if you have kids, or want to learn other stuff, checkout https://www.khanacademy.org/ )

Given the above points, it has never been easier for someone to create a capability to source information in real time from the real world, store that information online, apply unheard of computing power to that information using new, powerful and easy programming languages which can be learned online in a short period of time.

It might be a moot point that is valid at every point in time in every generation, but it has never been easier and cheaper to try out an idea online and see if it has legs.

So we have identified people with passion, a means of delivery and so now … what?

Those of you who are paying attention will realise that I have skirted the question of whether we have added any real value to the world, or whether we feel that we can. Time will tell, and I would hate to let the cat out of the bag too early. But there is one thing that is true: if you are one of those misguided, geek-friendly, meaning-seeking, after-hours change agents, or if you have an idea that could change the world, come and talk to us … the door is always open.

by Glenn Brickhill

Coding DevSecOps


The Enterprise Problem

Typically, IT policies cover many aspects of the technology landscape – including security. These policies are written up in elaborate documents and are then stored somewhere cryptic. Finding these policies is very often a challenge.

Then we hire experts to come in and manually run penetration tests against the environment which gives us measurement feedback on both compliance and vulnerabilities.

We then take these manually generated penetration test reports and ask people in our organizations to take the results and remediate according to the test findings.

While all this is happening, we make changes to current policies and bring new software into the environment to satisfy new business requirements. At the same time, new threats appear.

Each part of this process can span months, meaning the whole cycle can stretch out over many months.

We need to close this window and change the time frame from months to days, to hours and, ideally, to real time – to match the timeframes of potential hackers. This is DevSecOps, and I’m going to tell you how to do it.

The Road to DevSecOps

We’ve framed the enterprise problem, so how do we apply DevSecOps to it? Well, the answer requires delving a little deeper into DevSecOps.

DevSecOps = DevOps + Security (Sec)

In the world of DevSecOps, as you may predict, we have three teams working together: Development, Security and Operations.

The “Sec” of DevSecOps introduces process changes to the following elements of an organization:

  • Engineering
  • Operations
  • Data Science
  • Compliance

This may seem a little daunting, so let’s unpack these changes.

Engineering & Operations

Engineering refers to how you build with security in mind and bring security into your engineering pipeline. A typical engineering pipeline is shown below:

As we observe code eating the world in practice, the engineering pipelines for the Development, Security and Operations teams look very similar, and coding best practices apply to all. Everyone needs to change the way that they think. We are no longer working in silos but rather working together in a well co-ordinated and harmonious manner.

Development team

  • Writes the system code with security in mind
  • There are changes to the engineering pipeline (policies & practices), most notably static code analysis using SonarQube that looks for security vulnerabilities

Operations team

  • Writes Puppet code to manage infrastructure state up to the application layer, as well as to comply with OpenSCAP policies
  • Static code analysis by means of PuppetLint

Security team

  • Experiments, automates and tests new security approaches and creates Puppet modules

Security operations

  • Continues to detect, hunt and then contain threats
  • Writes OpenSCAP policies that align to IT policy

Some examples of the parallels:

>> Static code analysis: SonarQube for Development and PuppetLint for Operations

>> Automated unit tests: Beaker for Operations and xUnit for Development

With this convergence, it is reasonable to predict that the security team will soon follow similar practices once its toolsets reach the right level of maturity.

Data Science & Compliance

Once you start collecting data you can apply backward-looking analytics and forward-looking data science approaches. The data collected from DevSecOps can be used to augment already well-established security data. In particular, Puppet by its nature enforces a specific state; if this state changes without sanction, these events can be used as ‘trip wires’ to detect potential intruders.

Measurement gives you compliance feedback against your policies. This occurs at the cadence at which you configure Puppet to run, which defaults to 30 minutes.

To Conclude

In the new world, instead of having IT policies as documents, we codify them. Sec writes the policies, and then Dev & Ops work together to write the remediation code.

Measurement moves from a manual state to an automated state. We write the policy code in OpenSCAP and remediate policy breaches with the Puppet code we have written. Threats, environment changes and policy changes occur as we expect.
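As a purely illustrative sketch of what “policy as code” means – in practice this would be OpenSCAP content plus Puppet manifests rather than Python, and the file path and policy item below are hypothetical – consider a single check-and-remediate step:

```python
# Illustrative "policy as code" sketch (a conceptual stand-in for OpenSCAP + Puppet):
# check one hypothetical policy item and remediate it if the host is non-compliant.
from pathlib import Path

SSHD_CONFIG = Path("sshd_config")  # hypothetical path; the real file lives under /etc/ssh/


def is_compliant(config_text: str) -> bool:
    # Policy: root login over SSH must be explicitly disabled.
    return any(line.strip() == "PermitRootLogin no" for line in config_text.splitlines())


def remediate(config_text: str) -> str:
    # Drop any existing PermitRootLogin directive and append the compliant one.
    kept = [line for line in config_text.splitlines() if not line.strip().startswith("PermitRootLogin")]
    return "\n".join(kept + ["PermitRootLogin no"]) + "\n"


if __name__ == "__main__":
    text = SSHD_CONFIG.read_text() if SSHD_CONFIG.exists() else "PermitRootLogin yes\n"
    if not is_compliant(text):
        SSHD_CONFIG.write_text(remediate(text))
        print("Policy breach remediated: PermitRootLogin set to 'no'.")
    else:
        print("Already compliant.")
```

Conceptually, this is the loop a tool like Puppet runs across the estate at its configured cadence: the policy describes the desired state, and remediation is whatever it takes to restore it.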

The difference in the DevSecOps world is that we update the policy and remediation code in minutes, and then roll it out to the organization in hours.

That’s the true power of DevSecOps!

DevSecOps talks to how the principles of DevOps can be applied to the broader security context.

The path to building a culture of security in an organization needs to follow a similar path to that of DevOps: set the right expectations of outcome then empower and measure.

In this world of security challenges, can you afford not to do DevSecOps?

The Future

Look out for my next post on how to apply DevSecOps in a containerized environment!

by Jason Suttie

The essence of a FinTech team

Throughout my short career I have found myself wondering what the keys to success are. I have come to the realization that though the media tell us stories of successful individuals, few key inventions were conceptualized and industrialized by just one person. So what makes a successful team, and how would you put one together?


The idealist within me wishes that I could provide a recipe for the ideal FinTech team. I would like to be able to say that in order to revolutionize the world you need 5 analysts, 10 developers and 17 data scientists, but even this wouldn’t guarantee success. So what is the essence of a FinTech team? I may not have all the answers, but I do think there are some common elements in truly successful teams.

Purpose

The word purpose is overused but misunderstood. It took on new meaning for me when described by Viktor Frankl in his 1946 classic, “Man’s Search for Meaning”, within the context of the Nazi concentration camps. Viktor was a neurologist and psychiatrist who was captured and imprisoned in the camps. He shares his observations on the motivation and depression he saw in his fellow prisoners. Personally, I think Viktor does a better job of explaining it than I could.

Viktor explains that the reason people survived the Holocaust is that they had something else to live for, a true purpose. Sometimes this was as simple as a desire to see their family again; in other cases it was more complex. It is this motivation by purpose that I believe galvanizes a team.

Salim Ismail insists that all start-ups set a massive transformative purpose. These purpose statements need to be short and to the point so that there is no room for misinterpretation. If your purpose cannot be stated in one sentence, then it has not been distilled into its essence. This helps focus all team members on the same goal. Most importantly, it means that all team members should believe in the purpose. Getting this right is almost impossible, but I would be willing to bet that successful teams have gotten it right. My memory takes me back to South Africa’s 1995 Rugby World Cup winning team, who went through the entire tournament with the purpose statement of “one team, one nation.” A purpose that resonates so strongly in all individuals within the team makes it impossible to fail.

http://www.sport24.co.za/Rugby/Springbok-Heritage/1995-RWC-squad-honoured-for-greatest-day-in-SA-rugby-history-20150624

People

I was in awe of those start-up stories outlining how a group of people started a multi-billion-dollar company in their garage. In the past few years I have found myself in the proverbial garage of several different acquaintances and friends, and it was only then that I realized what was driving this behavior. I found myself drawn to these groups simply because we were enjoying the hard work and the time we were spending with each other. It is easier to accomplish a complicated and long-term goal when you have good people around you that you connect with. I’m not at all saying that you need to be best friends with all your team members, but I do believe that you need to find some commonality to have a human connection.

What about skills?

I am by no means diminishing the need for skilled people in your team. I am, however, making an assertion that even if you have the best skills, without a purpose and a connected team you are doomed to fail. Pay more attention to the qualitative things when setting up the team – the things we take for granted, like the feeling when you walk through the office doors, the vibe in the room, the “nice to have” social interactions.

So I guess my recipe is this:

Find a purpose that resonates with you. Then find a group of people that you can connect with. If the purpose resonates with your team, I believe you have a good chance of success.

by Tyrone Naidoo

Link to video: https://www.youtube.com/watch?v=fD1512_XJEw

The Evolution of Money – Part 2


(This is a continuation of my earlier blog: The Evolution of Money – Part 1)

The emergence of crypto instruments has allowed digital monetary value to be held without the need for a trusted intermediary for the first time in history. Crypto currencies – the most common type of crypto instrument today – such as Bitcoin, Ether, and many others demonstrate this fact. Anyone who owns crypto currency has a unique private key (akin to a password) that allows the owner (and only the owner) to mathematically “unlock” or spend the value at an associated public address (akin to an account number). The ledger that records the amount of crypto currency at any particular public address is maintained by a network of computers (called nodes) that run a consensus algorithm to ensure that they are all synchronised.

The ingenuity of this network of nodes working together to validate transactions (and reject any double-spending) is that the multitude of nodes ensures that there is no dependence on any single one to ensure the integrity of the ledger. This is a powerful concept – the way to remove a trusted intermediary is not to get rid of the intermediary, but to increase the number of intermediaries that are maintaining the same ledger so that trust in any particular intermediary is no longer needed. Dependence on any single intermediary reduces as the network grows, and indeed the term “intermediary” starts to become inappropriate and even inaccurate.
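As a toy illustration of this idea – real crypto currencies add digital signatures and a consensus algorithm such as proof of work, so this is a simplification rather than how Bitcoin actually works – imagine several identical nodes that each hold the same balance ledger and only accept a transaction that every one of them can validate:

```python
# Toy illustration of independent nodes holding the same ledger and rejecting a
# double-spend. Real crypto currencies add digital signatures and consensus.


class Node:
    def __init__(self, balances):
        self.balances = dict(balances)  # each node keeps its own copy of the ledger

    def validate_and_apply(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            return False  # insufficient funds: an attempted double-spend is rejected
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True


def broadcast(nodes, sender, receiver, amount):
    # In this toy, a transaction only "happens" if every node independently accepts it.
    return all(node.validate_and_apply(sender, receiver, amount) for node in nodes)


if __name__ == "__main__":
    genesis = {"alice": 10, "bob": 0}
    network = [Node(genesis) for _ in range(5)]        # five identical nodes
    print(broadcast(network, "alice", "bob", 10))      # True: valid spend
    print(broadcast(network, "alice", "bob", 10))      # False: double-spend rejected
    print(network[0].balances == network[4].balances)  # True: ledgers stay in sync
```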

To understand the financial world’s fascination with crypto currency and blockchain, we have to examine the nature of money. The traditional textbook definition of money refers to its three major functions in society: a means of exchange; a unit of account; and a store of value. Interrogating this further, however, it becomes apparent that all three functions relate to the concept of money representing value. After all, why accept money as a means of exchange for something else of value unless one believes the money has at least the value of what it was traded for? And a unit of account that measures the value of other products needs to possess value itself, otherwise it would be abandoned as a unit of account (as we have seen in any economy that has witnessed hyperinflation). So if the functions of money boil down to the value it possesses, what determines the value of money?

Six characteristics determine how effectively money performs the functions mentioned above and in turn determine its value. These characteristics are:

  1. Durability – if money is meant to store value and does not last long itself, it cannot function as a very good store of value;
  2. Portability – to facilitate trade, money needs to be very portable, and costs associated with transferring it from one party to another diminish its function as money;
  3. Fungibility – a unit of money should be exactly the same as any other unit, otherwise time and energy would be consumed in comparing tokens rather than promoting trade;
  4. Divisibility – the smallest unit of money must be worth less than every other tradable asset, otherwise another token of money would need to be used to trade something worth less than the smallest unit of money;
  5. Scarcity – the oversupply of any commodity brings its value down, and in the extreme case, where something is overly abundant, it cannot be used to trade for other scarce resources; and
  6. Acceptability – money is accepted because the recipient believes it will be accepted by others when he/she wants to spend it. Without the belief that money will be accepted by others in the future, it would cease to be money.

Crypto currency is a better performing form of money (versus physical cash and digital money) in two significant ways: (1) It is more durable (it’s backed up by many more servers across institutions than traditional digital money that is backed up only by the servers of an individual bank); and (2) it is more portable (as it is more seamless to move money on a single decentralised ledger than across different centralised ledgers). Money is supposed to be the most frictionless asset in society, and crypto currency is the most frictionless form of money to date.

The Evolution of Money – Part 3 coming soon!

by Farzam Ehsani


Design Indaba made me do it

This was the mantra for the 22nd annual Design Indaba conference, hosted in the beautiful city of Cape Town at the Artscape Theatre.


The Design Indaba Conference has grown to become one of the world’s leading design events and hosts more than 40 speakers and 2 500 delegates. It draws creatives from all spheres and industries to come together under one roof to share knowledge, inspire and to collaborate with one another.

We talked, mingled and networked, filling our inspiration tanks. There were graffiti artists, DJs, musicians, sculptors and various sponsor pop-ups and activation units, inviting us into this world of endless possibility and creativity.

Contrary to current perception, Design Indaba is not a conference ONLY for creatives – it is for everyone, from any field of expertise, who would like to ignite their senses and intrigue their minds. It’s a jam-packed three days, and I believe there is something that will speak to anyone’s core. This year was my first Design Indaba and it was a truly immersive experience, exceeding all my expectations.

The main highlight for me wasn’t the skill or talent of all these amazing people (even though that was incredible) but rather their thinking – this really stood out to me; they took us on a journey through the lens and into their magical minds!

Ultimately, Design Indaba wants to change the thinking of the world, one conference at a time, one creative at a time, and one business at a time.

It will take a generation of creative thinkers and implementers to see a turnaround. Design Indaba’s primary aim therefore is “to advance the cause of design as a communication fundamental, a business imperative and a powerful tool in industry and commerce, awakening and driving a demand for investment in intellectual capital”.

Investing nearly two decades in this vision, Design Indaba has championed the creative revolution. Here are some of my highlights from the 3-day event (content supplied from the Design Indaba weekly mailer):

The enchanted forest – Can beauty redeem us?

We were welcomed into the Design Indaba Festival 2017 through an enchanted forest of massive tree sculptures that were beautiful and surreal.

These tree sculptures were on exhibition for the entire conference and added a magical ambience to the festival courtyard. I felt like I was walking around in a world that was a mash-up of the movies Labyrinth and Alice in Wonderland (the Tim Burton version).


Capturing Cape Town’s scent with Kaja Solgaard Dahl

The thank-you gift for the festival this year was created by designer Kaja Solgaard Dahl, who is fascinated with creativity that uplifts our experience and affects the senses directly.

Her process and the end product are captivating and just incredible. She truly did capture the scent of Cape Town – whimsical, fresh and enlightening, yet eccentric.


Masters in the art of freestyling it

One of my main highlights of the festival was the amazing group called Freestyle Love Supreme, who would wrap up each day with freestyle rap and beatboxing. They were so entertaining and funny that I laughed so hard my face hurt.

The Design Indaba team chatted to Freestyle Love Supreme ahead of their Design Indaba daily wrap ups and once-off performance on the Thursday at Nightscape.



Swahili launches on Duolingo

At Design Indaba 2017, Luis von Ahn launched the first African language course on Duolingo. The audience went wild when he told us; he then went on to say that the second African language they will be launching will be Zulu. We can’t wait to see more African languages on this amazing app.


Arch For Arch: A coda for Design Indaba Festival Day 3

The spectacular finale of the 2017 conference was a tribute to Archbishop Desmond Tutu. It was a great honour and privilege for me to be a part of this amazing ceremony and to hear the incredible and humble Archbishop Desmond Tutu talk. It was a great way to end an amazing festival – I left feeling inspired.


Thank you, Design Indaba, for the wonderful experience – we are looking forward to where you go from here.

So, if you think that Design Indaba isn’t for you – think again. Book your ticket for next year and immerse yourself.

by Mari-Liza Monteiro


The changing world around programmers


In today’s ever-changing world, we find that businesses have become more concerned with what you can do than with what qualification you have. This paradigm is becoming more apparent as companies face an unbelievable shortage of decent coders who are able to deliver to their expectations. This gap in the employment market is widening, as the number of BSc Computer Science graduates that universities turn out falls far short of actual demand.

This situation has led the industry to change the way it looks at qualifications and to focus more on a person’s ability to code and learn. If you are a self-taught coder and have an understanding of industry-relevant technology, you are in a much better position than someone who still has to go to university and learn to code there for the first time. A few companies are willing to take the risk of hiring someone without formal coding qualifications, and have reaped the rewards of taking those risks. The coders that they hire generally seem to be more aware of what new technology is available, and are more willing to learn something new in order to help them grow further.

We are starting to see a paradigm shift in the industry and in the way people think. The Stack Overflow survey statistics show that the proportion of self-taught developers increased from 41.8% in 2015 to 69.1% in 2016. This shows that a lot of developers are self-taught, and that more people are teaching themselves how to code each year. People who start to code from a young age show real passion for it, and in combination with their curiosity for learning something new, their love for it speaks volumes. To have the ability to create anything you can think of on a PC, to make the machine behave exactly as you want it to and to see a visual representation of this, is unbelievable.

For those interested in teaching themselves how to code, there are many websites to look at. Here is a list of 10 places you can learn coding from, but I will list the top 3 places that I learnt the most from:

Those websites each have their own way of teaching code, and if you combine this with some YouTube videos from CS50 and MIT OpenCourseWare, you will be all set to learn at your own pace. HackerRank is a good way to test everything you have learnt, and you can see how you rank against the world.

WeThinkCode_ is an institution for learning to code, open to anyone from 17 to 35 years old. Their thinking is that you do not need a formal qualification to be a world-class coder. More institutes like this are opening across the world. Such a wide age range illustrates that you are never too old to learn how to code. There are also more and more coding education opportunities for young people. It is really easy to learn how to code from a young age, as that is when your mind is at its prime to learn new things and adjust to constant change.

 In a programmer’s world you are constantly learning new things and this is what makes our jobs exciting.

The world is ever-evolving and we all need to keep adjusting our mindsets on how we look at things, otherwise we will be left behind while everyone moves forward.

By Gabriel Groener

Why it took 400 years to invent the wing


THE MYSTERY OF SIMULTANEOUS INVENTION

You would be forgiven for thinking the three photographs below were various “Wright Flyers” piloted by renowned flight pioneers Wilbur and Orville Wright.

Image credits: NPS.gov; Wright-brothers.org

They’re not – they are photographs of different flights that took place at roughly the same time as the famous Wright brothers’ flight. Not all of these inventors knew of each other’s existence prior to their inventions. This bizarre case of “simultaneous invention” has occurred many times, both before and since the Wright brothers’ flight. The polio vaccine was developed by three separate scientists almost at once. The patent for the telephone was filed by two separate individuals on the same day.

Why does innovation occur simultaneously? We tend to have an idealised view of how scientists work: a picture of an individual in a workshop making a few sketches and shouting out in joy at having thought of the wing. If this were indeed the case, then the occurrence of simultaneous inventions would almost defy logic.

The reality is that inventors and innovators absorb the views, thoughts and ideas of their day, as well as existing technology, and it is this trait that explains the phenomenon of simultaneous discovery.

The story of fixed-wing self-powered flight

The first recorded study of flight was Leonardo da Vinci’s “Codex on the Flight of Birds” in 1505. John Smeaton was the first to attempt to quantify the phenomenon of lift, prior to 1800. Using the concept of lift, George Cayley, just after 1800, conceived the cambered airfoil and made the world’s first glider. The glider could barely move any practical distance. Otto Lilienthal, in 1889, took experimentation to a new level: by absorbing the thoughts of his day, he made an astonishing 2,500 glides and documented his findings in the famous “Lilienthal tables”. The Wright brothers could not reproduce the data in the Lilienthal tables because of an error in the concept developed by Smeaton over 100 years earlier. Out of frustration, they went over and above Lilienthal’s experiments by creating the world’s first rudimentary wind tunnel. They realised that camber, aspect ratio and angle of attack all contributed to the lift generated. And so, 398 years after its first study, Wright Wing Number 31 was selected for the historic flight. The wing, by itself, was insufficient for the flight – they had to procure the latest internal combustion engine to power the plane. Luckily for them, this had been developed in parallel, with its own intricate history. The Wright brothers, by profession, were bicycle manufacturers, not backyard inventors or carpenters – the perfect candidates for flight pioneers. Think lightweight rivets, spokes, wheel rims and tubes.
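As a brief aside (this equation does not appear in the original story): the quantity those pioneers were groping towards is captured today by the standard lift equation, in which the effects of camber and angle of attack are bundled into the lift coefficient:

```latex
L = \tfrac{1}{2}\,\rho\, v^{2}\, S\, C_L
```

Here L is lift, ρ is air density, v is airspeed, S is wing area and C_L is the lift coefficient. The nineteenth-century formulations that Lilienthal and the Wrights inherited folded an empirical “Smeaton coefficient” into this relationship, and it was the accepted value of that coefficient that turned out to be wrong – which is a large part of why the Wrights ended up re-measuring lift in their own wind tunnel.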

The Adjacent Possible explains simultaneous discovery

The same fascinating story can be found in numerous other inventions, such as the Gutenberg press. The movable type, the press, the paper and the ink all have stories of their own, and few of them can be traced back to “Eureka” moments. Steven Johnson popularised the concept of the adjacent possible, which originally has its roots in theoretical biology. As Steven Johnson writes in the Wall Street Journal:

“the [adjacent possible] boundaries grow as you explore them. Each new combination opens up the possibility of other new combinations. Think of it as a house that magically expands with each door you open. You begin in a room with four doors, each leading to a new room that you haven’t visited yet. Once you open one of those doors and stroll into that room, three new doors appear, each leading to a brand-new room that you couldn’t have reached from your original starting point. Keep opening new doors and eventually you’ll have built a palace.”

We can therefore argue that no matter how much of a genius da Vinci was, he could not possibly have made a flying machine back in 1505. He was not at the boundary of the adjacent possible. He most certainly contributed to it, as he was part of the Renaissance and laid the early foundations of putting innovative thoughts on paper. John Smeaton’s lift equation was wrong, but it was a critical contribution in that it attempted to quantify the mysterious phenomenon of lift in an equation and enabled Otto Lilienthal to record his famous tables.

What can Fintech Learn about the Adjacent Possible?

The story of the wing is an extreme case study of iterative innovations towards a single invention.

Players in the Fintech space could learn from this theme. Innovation happens on the boundaries of the adjacent possible. For example, crypto currencies could not be implemented prior to the ability to hold distributed ledgers on multiple databases connected in a common consortium. Eureka moments are indeed rare. Innovation initiatives should reach out to the world to absorb the thoughts and ideas of the day. Businesses should look within their own boundaries to find their own “Lilienthal tables” – to see what worked and what didn’t, in order to innovate effectively.

The Palau island tribes did, in theory, implement a blockchain in 500 AD, but the “ledger” was effectively a set of narratives held by the tribes’ elders.

by Dejan Popovic

Which race are you in?

As we hurtle headlong into 2017, it’s becoming increasingly clear that no matter which generation you find yourself in – Xers embracing tech, Ys passionately living the dream or Zs pushing us all faster than we ever believed we could go – if you are not concentrating… this digital world will run right past you before you blink.

http://europe2017.finovate.com/

At Finovate 2017 in London last week, I was struck firstly by the intensity of this pace – the leaps that tech has taken over the past year, but also, and more importantly, by the spirit of partnership.

No longer are we in a world where competition is about being the fastest or the smartest; we are living in a world where winning is about bundling those who are faster and smarter than you into meaningful solutions for the business you are in and the clients that you serve.

In banking it’s too late for us to say “let’s build our own” or “let’s throw money at disruption”; we need to get our heads around connecting fintech dots to build the best solutions for our clients. In biometrics and authentication the solutions are overwhelming; similarly in app design and integration. Banking is less and less about paper trails and complicated products and more about integrating whole-life solutions with ease of use and integrated platforms. It’s not about selling products any more; it’s about connecting the right client to the appropriate product they need for the time of their life that they are in – most often aided by a funkily named chatbot. The worlds of social media and banking have already converged (yup, that ship has sailed), and payments is fast becoming something everyone does… everyone! We can already buy packaged analytics and information about pretty much anything we need.

Banking has morphed from functional practicality to gorgeous design, insightful user experience and lifestyle products that adjust to the needs of its customers. Tricky thing is that much of that “banking” isn’t coming from banks! So what on earth should banks be doing?

Concentrating? Yes. Trying to keep up? No. Collaborating? Absolutely!

Finovate entrepreneurs brought solutions to banking problems we never even knew existed. They challenged views of what banks do and encouraged us all to ask “how can we help you help us help our clients?” More importantly, though, they showed what collaboration brings. Over and over, as the 7-minute spots passed by, it was clear that these entrepreneurs are building on one another’s work – each using bits of what others had built to supersize the solutions they were prototyping.

And that is the way to stay in the race! So as we train for the year ahead, we need to make sure we have the insight to navigate the way forward, the partnerships with fintechs to supersize our banking offerings, and the deep relationships with clients to package this stream of incredible ideas in ways that make them not only satisfied but thrilled with the way they interact with our ecosystem.

by Liesl Bebb Mckay