The Dimensions Of An Effective Data Science Team



The Need for Data Science

Organisations worldwide are increasingly looking to data science teams to provide business insight, understand customer behaviour and drive new product development. The broad field of Artificial Intelligence (AI) including Machine Learning (ML) and Deep Learning (DL) is exploding both in terms of academic research and business implementation. Some of the world’s biggest companies including Google, Facebook, Uber, Airbnb, and Goldman Sachs derive much of their value from data science effectiveness. These companies use data in very creative ways and are able to generate massive amounts of competitive advantage and business insight through the effective use of data.

Have you ever wondered how Google Maps predicts traffic? How does Facebook know your preferences so accurately? Why would Google give a platform as powerful as Gmail away for free? Having data and a great idea is a start – but the likes of Facebook and Google have figured out that a key step in the creation of amazing data products (and the resultant business value generation) is the formation of highly effective, aligned and organisationally-supported data science teams.

Effective Data Science Teams

How exactly have these leading data companies of the world established effective data science teams? What skills are required and what technologies have they employed? What processes do they have in place to enable effective data science? What cultures, behaviours and habits have been embraced by their staff and how have they set up their data science teams for success? The focus of this blog is to better understand at a high level what makes up an effective data science team and to discuss some practical steps to consider. This blog also poses several open-ended questions worth thinking about. Later blogs in this series will go into more detail in each of the dimensions discussed below.

Drew Harry, Director of Data Science at Twitch, wrote an excellent article titled “Highly Effective Data Science Teams”. He states that “Great data science work is built on a hierarchy of basic needs: powerful data infrastructure that is well maintained, protection from ad-hoc distractions, high-quality data, strong team research processes, and access to open-minded decision-makers with high leverage problems to solve” [1].

In my opinion, this definition accurately describes the various dimensions that are necessary for data science teams to be effective. As such, I would like to attempt to decompose this quote further and try to understand it in more detail.

Drew Harry’s Hierarchy of Basic Data Science Needs

Great data science requires powerful data infrastructure

A common pitfall of data science teams is that they are sometimes forced, either through lack of resources or through lack of understanding of the role of data scientists, to do time-intensive data wrangling activities (sourcing, cleaning and preparing data). Additionally, data scientists are often asked to complete ad-hoc requests and build business intelligence reports. These tasks should ideally be removed from the responsibilities of a data science team so that they can focus on their core capability: utilising their mathematical and statistical abilities to solve challenging business problems and find interesting patterns in data, rather than expending their efforts on housekeeping work. To do this, data scientists should ideally be supported by a dedicated team of data engineers, who build robust data infrastructures and architectures and implement tools to assist with data acquisition, data modelling, ETL and data architecture.


An example of this is at Facebook, a world leader in data engineering. Just imagine for a second the technical challenges inherent in providing over one billion people a personalised homepage full of various posts, photos and videos on a near-real time basis. To do this, Facebook runs one of the world’s largest data warehouses storing over 300 petabytes of data [2] and employs a range of powerful and sophisticated data processing techniques and tools [3]. This data engineering capability enables thousands of Facebook employees to effectively use their data to focus on value enhancing activities for the company without worrying about the nuts and bolts of how the data got there.

I realise that we are not all blessed with the resources and data talent inherent in Silicon Valley firms such as Facebook. Our data landscapes are often siloed, and our IT support teams – where data engineers traditionally reside – mainly focus on keeping the lights on and putting out fires. But this model has to change – set up your data science teams to have the best chance of success. Co-opt a data engineer onto the data science team. If this is not possible due to resource constraints, then at least provide your data scientists with the tools to easily create ETL code and rapidly spin up bespoke data warehouses, enabling rapid experimentation. Whatever you do, don’t let them be bogged down in operational data sludge.
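If you do end up scripting your own ETL, even a minimal pipeline beats repeated hand-wrangling. A toy sketch in Python – the file contents, table and column names are all invented for illustration:

```python
import csv, sqlite3, io

# Hypothetical raw extract: in practice this would come from a source system.
raw_csv = io.StringIO(
    "customer_id,spend\n"
    "1,100.50\n"
    "2,\n"        # an incomplete record, to be cleaned out
    "3,250.00\n"
)

# Extract
rows = list(csv.DictReader(raw_csv))

# Transform: drop incomplete records, cast types
clean = [(int(r["customer_id"]), float(r["spend"])) for r in rows if r["spend"]]

# Load into a lightweight local "warehouse"
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE spend (customer_id INTEGER, spend REAL)")
db.executemany("INSERT INTO spend VALUES (?, ?)", clean)

total = db.execute("SELECT SUM(spend) FROM spend").fetchone()[0]
print(total)  # 350.5
```

A real pipeline would add scheduling, logging and error handling, but the extract–transform–load shape stays the same.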

Great data science requires easily accessible, high-quality data


Data should be trusted and of high quality, and there should be enough of it to allow data scientists to execute experiments. Data should be easily accessible, and the team should have processing power capable of running complex code in reasonable time frames. Data scientists should, within legal boundaries, have easy, autonomous access to data. Data science teams should not be precluded from using data on production systems; rather than a blanket “hey – this is production – don’t you dare touch!”, mechanisms need to be put in place to allow safe access.

In order to support their army of business users and data scientists, eBay, one of the world’s largest auction and shopping sites, has successfully implemented a data analytics sandbox environment separate from the company’s production systems. eBay allows employees that want to analyse and explore data to create large virtual data marts inside their data warehouse. These sandboxes are walled off areas that offer a safe environment for data scientists to experiment with both internal data from the organisation as well as providing them with the ability to ingest other types of external data sources.

I would encourage you to explore the creation of such environments in your own organisations in order to provide your data science teams with easily accessible, high-quality data that does not threaten production systems. It must be noted that to support this kind of environment, your data architecture must allow for the integration of all of the organisation’s (and other external) data – both structured and unstructured. As an example, eBay has an integrated data architecture that comprises an enterprise data warehouse that stores transactional data, a separate Teradata deep-storage database for semi-structured data, and a Hadoop implementation for unstructured data [4]. Other organisations are creating “data lakes” that allow raw, structured and unstructured data to be stored in vast, low-cost data stores. The point is that the creation of such integrated data environments goes hand in hand with providing your data science team with analytics sandbox environments. As an aside, all the effort going into your data management and data compliance projects will also greatly assist in this regard.
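The sandbox pattern can be illustrated in miniature: copy only the slice of data an experiment needs into a walled-off store, leaving production untouched. A toy sketch using SQLite as a stand-in for both environments (schema and data are invented):

```python
import sqlite3

# Hypothetical "production" store (SQLite stands in for the warehouse)
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE transactions (id INTEGER, amount REAL, region TEXT)")
prod.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                 [(1, 10.0, "ZA"), (2, 20.0, "UK"), (3, 30.0, "ZA")])

# Carve off a walled-off sandbox: copy only the slice the analyst needs,
# so experiments never touch the production tables themselves.
sandbox = sqlite3.connect(":memory:")
sandbox.execute("CREATE TABLE transactions (id INTEGER, amount REAL, region TEXT)")
subset = prod.execute("SELECT * FROM transactions WHERE region = 'ZA'").fetchall()
sandbox.executemany("INSERT INTO transactions VALUES (?, ?, ?)", subset)

print(sandbox.execute("SELECT COUNT(*) FROM transactions").fetchone()[0])  # 2
```

In a real warehouse the "copy" would be a virtual data mart or snapshot, but the isolation principle is the same.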

Great data science requires access to open-minded decision-makers with high leverage problems to solve


DJ Patil stated that “A data-driven organisation acquires, processes, and leverages data in a timely fashion to create efficiencies, iterate on and develop new products, and navigate the competitive landscape” [5]. This culture of being data-driven needs to be driven from the top down. As an example, Airbnb promotes a data-driven culture and uses data as a vital input in their decision-making process [6]. They use analytics in their everyday operations, conduct experiments to test various hypotheses, and build statistical models to generate business insights, to great success.

Data science initiatives should always be supported by top-level organisational decision-makers. These leaders must be able to articulate the value that data science has brought to their business [1]. Wherever possible, co-create analytics solutions with your key business stakeholders.  Make them your product owners and provide feedback on insights to them on a regular basis. This will help keep the business context front of mind and allows them to experience the power and value of data science directly. Organisational decision-makers will also have the deepest understanding of company strategy and performance and can thus direct data science efforts to problems with the highest business impact.

Great data science requires strong team research processes

Data science teams should have strong operational research capabilities and robust internal processes. This will enable the team to execute controlled experiments with high levels of confidence in their results. Effective internal processes help promote a culture of failing fast, learning quickly and feeding valuable results back into the business experiment/data science loop. Google and Facebook have mastered this in their ability to, amongst other things, aggregate vast quantities of anonymised data, conduct rapid experiments and share these insights internally with their partners, generating substantial revenues in the process.

Think of this as applying robust software engineering principles to your data science practice. Ensure that your documentation is up to date and of a high standard. Ensure that there is a process for code review, and that you are able to correctly interpret the results that you are seeing in the data. Test the impact of this analysis with your key stakeholders. As Drew Harry states, “controlled experimentation is the most critical tool in data science’s arsenal and a team that doesn’t make regular use of it is doing something wrong” [1].
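To make “controlled experimentation” concrete, a two-proportion z-test is a common first check when comparing conversion rates between an A group and a B group. A self-contained sketch – the sample sizes and conversion counts are invented:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_b - p_a) / se
    # standard normal CDF via erf, doubled for a two-sided test
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented experiment: 200/2400 conversions on A, 260/2400 on B
z, p_value = two_proportion_z(conv_a=200, n_a=2400, conv_b=260, n_b=2400)
print(round(z, 2), round(p_value, 4))
```

If the p-value falls below your chosen significance level, the difference is unlikely to be noise – which is exactly the kind of disciplined check that separates an experiment from an anecdote.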

In Closing

This blog is based on a decomposition of Drew Harry’s definition of what enables great data science teams. It provides a few examples of companies doing this well and some practical steps and open-ended questions to consider.

To summarise: A well-balanced and effective data science team requires a data engineering team to support them from a data infrastructure and architecture perspective. They require large amounts of data that is accurate and trusted. They require data to be easily accessible and need some level of autonomy in accessing data. Top level decision makers need to buy into the value of data science and have an open mind when analysing the results of data science experiments. These leaders also need to be promoting a data-driven culture and provide the data science team with challenging and valuable business problems. Data science teams also need to keep their house clean and have adequate internal processes to execute accurate and effective experiments which will allow them to fail and learn quickly and ultimately become trusted business advisors.

Some Final Questions Worth Considering and Next Steps

In writing this, some intriguing questions come to mind. Surely there is an African context to consider here? What are we doing well on the African continent, and how can we start becoming exporters of effective data science practices and talent? Other questions include: To what extent does all of the above need to be in place at once? What is the right mix of data scientists, data engineers and analysts? What is the optimal mix of permanent, contractor and crowd-sourced resources (e.g. Kaggle-like initiatives [7])? Academia, consultancies and research houses are beating the drum of how important it is to be data-driven, but to what extent is this always necessary? Are there some problems that shouldn’t be using data as an input? Should we be purchasing external data to augment our internal data, and if so, what data should we be purchasing? One of our competitors recently launched an advertising campaign explicitly stating that their customers are “more than just data” – does this imply that some sort of “data fatigue” is setting in for our clients?

My next blog will explore in more detail, the ideal skillsets required in a data engineering team and how data engineering can be practically implemented in an organisation’s data science strategy. I will also attempt to tackle some of the pertinent open-ended questions mentioned above.

The dimensions discussed in this blog are by no means exhaustive, and there are certainly more questions than answers at this stage. I would love to see your comments on how you may have seen data science being implemented effectively in your organisations or some vexing questions that you would like to discuss.

References

[1] https://medium.com/mit-media-lab/highly-effective-data-science-teams-e90bb13bb709

[2] https://blog.keen.io/architecture-of-giants-data-stacks-at-facebook-netflix-airbnb-and-pinterest-9b7cd881af54

[3] https://www.wired.com/2013/02/facebook-data-team/

[4] http://searchbusinessanalytics.techtarget.com/feature/Data-sandboxes-help-analysts-dig-deep-into-corporate-info

[5] https://books.google.co.za/books?id=wZHe0t4ZgWoC&printsec=frontcover#v=onepage&q&f=false

[6] https://medium.com/airbnb-engineering/data-infrastructure-at-airbnb-8adfb34f169c?s=keen-io

[7] https://www.kaggle.com/

by Nicholas Simigiannis

The Doosra

Working in an investment bank over the past decade has provided the opportunity for many interesting conversations around what the value to society of an investment bank represents. Often the model of a “zero sum game” is proposed which suggests that finance often doesn’t add much – in terms of the transactions that banks facilitate, someone is a winner and someone else is the loser, there is no net gain to the world. Other purists would argue something along the lines of efficient allocation of resources. That initially sounded a bit too creative for my more linear reasoning, but after years in the trenches, it has developed an intuitive ring of truth to it.


Similarly, digital disruption suffers from a questionable motive. For some enterprises, such as Uber, it may appear that the shiny plaything of some young geeks on the west coast of America has been allowed to plough through the livelihoods of real people with real jobs and families around the world. When applying such thinking to digital disruption in the realm of investment banking, the question arises as to whether there is any real value that this rather obscure digital offspring of an already often-questioned enterprise can produce.

At times this line of thinking has led me to check my own passion for this “new vector of commerce”. How do I ensure that my natural fascination with some “new and shiny” geek toy is not diverting what should be a cold, objective application of technology to investment banking, rather than being an excuse to pursue disruption for its own sake? How do we ensure a golden thread of validity and meaning to this exercise?

I started thinking about Google, and how I could justify what value they might have brought to the world (and not just their shareholders). I won’t pretend that I spent much time on this question, but I did come to the following example. Google Maps is a fantastic application, and I probably initially loved it more for the fact that it brings the real world (travel, maps, my phone, my car) together with the digital world (the internet, GPS technology, cloud-based algorithms).

However, it is a tool that many people use, and its value extends beyond that initial fascination. In a very real way, there are likely hundreds of millions of people who use Google Maps every day to guide them on an optimal route in their cars. And, true to form, it manages to do this: either by advising detours around potential traffic jams, or by simply showing quicker routes that save time.

That extra time in traffic that has been avoided represents a very real saving in carbon emissions, and real energy that would otherwise have been wasted pumping cylinders up and down in an idling vehicle. This is not a zero-sum equation where Google benefits and many small companies lose out. This is a very real benefit to the world, where increased efficiency reduces the amount of wasted energy and wasted human time. This is a net positive game for the world. In some respect the world of humans wins, and the domain of entropy loses – if we are forced to put a name to it.
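The claim can be made concrete with a back-of-envelope calculation. Every input below is an illustrative assumption, not a measured figure:

```python
# Back-of-envelope: CO2 avoided by rerouting drivers around congestion.
# All numbers here are illustrative assumptions, not measured figures.
users = 100_000_000         # daily drivers guided onto a quicker route
minutes_saved = 2           # average idling time avoided per driver per day
idle_fuel_l_per_hr = 0.8    # rough petrol burn of an idling car, litres/hour
co2_kg_per_litre = 2.3      # approximate CO2 per litre of petrol burned

litres_saved = users * (minutes_saved / 60) * idle_fuel_l_per_hr
tonnes_co2_per_day = litres_saved * co2_kg_per_litre / 1000
print(f"{tonnes_co2_per_day:,.0f} tonnes of CO2 avoided per day")
```

Even if each assumption is off by a factor of two, the order of magnitude suggests a real, daily, non-zero-sum saving.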

Personally I would feel deeply gratified if I could produce such a result that created a new benefit to either the world, or at the very least some small piece of it.

Interestingly enough, this speaks to an underlying theme which appeals to many people who are attracted to incubators of disruption, such as the Foundery. Many people really do feel that they would like to be part of something that changes the world. Perhaps this is because such incubators invoke the perceived “spirit” of Google, Facebook and other Silicon Valley heroes as an inspirational rallying cry. I believe that the example of Google Maps shows that the present opportunity of disruptive technology can represent a possibility for such very real efficiencies and benefits to be created. Perhaps those seemingly naive passions that are stirred in the incubatees are valid, and should be released to find their form in the world.

So how do we harness this latent energy? Where do we direct it for the best chance of success?

Some of the technologies to be harnessed, which together represent the opportunity of disruptive technology:

  1. IoT (the internet of things):

At its most simple, this means that various electronic components have become sufficiently small, powerful and, most importantly, cheap. It is now possible and economically viable to monitor the temperature, humidity and soil hydration of every single plant in a field on a farm, or to measure the status of every machine on a production line in a small factory on the East Rand, without bankrupting the owner with implementation costs.
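The sensing half of that idea fits in a few lines. In a real deployment the readings would stream in from devices (over MQTT or HTTP, say); here they are hard-coded for illustration, and the plant names and threshold are invented:

```python
# Hypothetical readings from cheap field sensors; real values would arrive
# from devices over MQTT/HTTP rather than a hard-coded list.
readings = [
    {"plant": "A1", "soil_moisture": 0.42},
    {"plant": "A2", "soil_moisture": 0.12},   # too dry
    {"plant": "B1", "soil_moisture": 0.35},
]

DRY_THRESHOLD = 0.20  # illustrative: below this, the plant needs water

# Flag every plant whose soil is drier than the threshold
alerts = [r["plant"] for r in readings if r["soil_moisture"] < DRY_THRESHOLD]
print(alerts)  # ['A2']
```

The economics the paragraph describes come from exactly this simplicity: the per-plant logic is trivial, so the cost sits almost entirely in the (now cheap) sensors.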

Apart from sensors, there are actuators in the world such as smart locks, smart lights and the smart home which enable real-world actions to be driven and controlled from the internet. Together these provide the mechanism for the real world to be accessible to the digital world.

This extends beyond the “real” real world: there are changes at play, not too far under the surface of the modern financial system, that are turning the real world of financial “things” (shares, bonds, financial contracts) into the internet world of financial “things” (dematerialised and digitised shares, bonds online, financial contracts online).

There are also actuators in this world, such as electronic trading venues and platforms which enable manipulation of digital financial contracts by digital actors of finance.

  2. Data storage is essentially free:

The cost per megabyte of storage continues to drop exponentially, and online providers are able to offer services on a rental basis that would have been inconceivable a decade ago. The ubiquity of cheap, fast bandwidth amplifies this even further.

  3. Computation is cheaper than ever, and simple to locate with cloud-based infrastructure:

Moore’s law continues unabated, providing computational power that drops in cost by the day – to say nothing of the promise of quantum computing, which seems just around the corner.

  4. The technologies to utilize are powerful, free and easy to learn:

If you have not yet done so, have a sojourn on the internet across such topics as Python, TensorFlow, Quandl, Airflow and GitHub. These represent (largely) free, open-source capabilities to harness the technologies above and make them your plaything. Not only that, the amount of free resources “out there” which can help you master each of these is astounding.

A brief exercise in trying to automate my house using Python has revealed hundreds of YouTube videos of similarly obsessed crazies presenting fantastic applications of Python to automating everything from their garage doors, fish tanks and pool chlorine management systems to their alarms. These YouTube videos are short, to the point, educational, free and, most importantly, crowd-moderated – all the other Python home automation geeks have ensured that the very good videos are upvoted and easily found, while the least fit are doomed to obscurity.
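Most of those home automation projects boil down to evaluating a simple rule on a schedule. A stripped-down Python sketch – the rule and times are invented, and a real version would go on to call a smart plug's API:

```python
from datetime import time

# Hypothetical rule: garden lights on between sunset and midnight.
ON_FROM, ON_UNTIL = time(18, 30), time(23, 59)

def light_should_be_on(now: time) -> bool:
    """Evaluate the on/off rule for a given wall-clock time."""
    return ON_FROM <= now <= ON_UNTIL

# In a real setup this check would run in a loop (or under cron) and then
# call the smart plug's API; here we just evaluate the rule.
print(light_should_be_on(time(20, 0)))   # True
print(light_should_be_on(time(7, 15)))   # False
```

The garage doors, fish tanks and chlorine pumps are all variations on this loop: read a clock or a sensor, evaluate a rule, drive an actuator.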

This represents another, perhaps unforeseen, benefit of the internet: crowd-sourced, crowd-moderated, efficient and specific education. JIT learning (“just-in-time learning”) means being able to learn everything that you need to accomplish a task five minutes before you need to solve it, and perhaps to forget it all almost immediately once you have solved it. (That is an interesting paradigm to counter traditional education.)

(P.S. if you have kids, or want to learn other stuff, check out https://www.khanacademy.org/)

Given the above points, it has never been easier for someone to create a capability to source information in real time from the real world, store that information online, apply unheard of computing power to that information using new, powerful and easy programming languages which can be learned online in a short period of time.

It might be a moot point that is valid at every point in time in every generation, but it has never been easier and cheaper to try out an idea online and see if it has legs.

So we have identified people with passion, a means of delivery and so now … what?

Those of you who are paying attention will realise that I have skirted the question of whether we have added any real value to the world, or whether we feel that we can. Time will tell, and I would hate to let the cat out of the bag too early. But there is one thing that is true: if you are one of those misguided, geek-friendly, meaning-seeking, after-hours change agents, or if you have an idea that could change the world, come and talk to us … the door is always open.

by Glenn Brickhill

Coding DevSecOps


The Enterprise Problem

Typically, IT policies cover many aspects of the technology landscape – including security. These policies are written in elaborate documents and then are stored somewhere cryptic. Finding these policies is very often a challenge.

Then we hire experts to come in and manually run penetration tests against the environment which gives us measurement feedback on both compliance and vulnerabilities.

We then take these manually generated penetration test reports and ask people in our organizations to take the results and remediate according to the test findings.

While all this is happening, we make changes to current policies and bring new software into the environment to satisfy new business requirements. At the same time, new threats appear.

Each part of this process can span months, meaning the cycle as a whole can stretch even longer.

We need to close this window – changing the time frame from months to days, to hours and, ideally, to real time – to match the timeframes of potential hackers. This is DevSecOps, and I’m going to tell you how to do it.

The Road to DevSecOps

We’ve framed the enterprise problem, now how do we apply DevSecOps to it? Well the answer is to delve a little more into DevSecOps.

DevSecOps = DevOps + Security (Sec)

In the world of DevSecOps, as you may predict, we have three teams working together: development, security and operations.

The “Sec” of DevSecOps introduces process changes to the following elements of an organization:

  • Engineering
  • Operations
  • Data Science
  • Compliance

This may seem a little daunting, let’s unpack these changes.

Engineering & Operations

Engineering refers to how you build with security in mind and bring security into your engineering pipeline. A typical engineering pipeline is shown below:

As we watch code eating the world, the engineering pipelines for the development, security and operations teams look very similar, and coding best practices apply to all. Everyone needs to change the way they think: we are no longer working in silos but rather working together in a well-coordinated and harmonious manner.

Development team

  • Writes the system code with security in mind
  • There are changes to the engineering pipeline (policies & practices), most notably static code analysis using SonarQube that looks for security vulnerabilities

Operations team

  • Writes Puppet code to manage infrastructure state up to the application layer, and to comply with OpenSCAP policies
  • Static code analysis by means of PuppetLint

Security team

  • Experiments, automates and tests new security approaches and creates Puppet modules

Security operations

  • Continues to detect, hunt and then contain threats
  • Writes OpenSCAP policies that align to IT policy

Some examples of the parallels:

  • Static code analysis: SonarQube for development, PuppetLint for operations

  • Automated unit tests: xUnit for development, Beaker for operations

With this convergence, it is reasonable to predict that the security team will soon follow similar practices once the toolsets reach the right levels of maturity.

Data Science & Compliance

Once you start collecting data, you can apply backward-looking analytics and forward-looking data science approaches. The data collected from DevSecOps can be used to augment already well-established security data. In particular, Puppet by its nature enforces a specific state; if this state changes without sanction, these events can be used as ‘trip wires’ to detect potential intruders.
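The trip-wire principle is easy to sketch: record a fingerprint of the sanctioned state and flag any drift from it. An illustrative Python version using a file hash (Puppet tracks far richer state; this shows only the principle, and the config content is invented):

```python
import hashlib, os, tempfile

def fingerprint(path):
    """SHA-256 of a file's contents – the marker of the sanctioned state."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Simulate a managed config file
cfg = tempfile.NamedTemporaryFile(mode="w", suffix=".conf", delete=False)
cfg.write("PermitRootLogin no\n")
cfg.close()

baseline = fingerprint(cfg.name)

# An unsanctioned change – the kind of drift Puppet would normally revert
with open(cfg.name, "a") as f:
    f.write("PermitRootLogin yes\n")

tripped = fingerprint(cfg.name) != baseline
print("trip wire fired:", tripped)  # trip wire fired: True
os.unlink(cfg.name)
```

In the DevSecOps loop, that boolean would feed the security operations team's detection pipeline rather than a print statement.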

Measurement gives you compliance feedback against your policies. This occurs at whatever cadence you configure Puppet to run, which defaults to 30 minutes.

To Conclude

In the new world – instead of having IT policies as documents, we codify them. Sec writes the policies and then Dev & Ops work together to write the remediation code.
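Codifying a policy can be as simple as expressing each rule as a check that passes or fails against observed state. A toy sketch – the rule names and observed values are invented:

```python
# Toy "policy as code": Sec expresses each rule as a named check, and
# Dev & Ops write remediation against whatever fails.
policies = {
    "ssh_permit_root_login_disabled": lambda s: s["permit_root_login"] == "no",
    "password_min_length_12":         lambda s: s["password_min_length"] >= 12,
}

# Hypothetical state observed on a host
observed_state = {"permit_root_login": "yes", "password_min_length": 14}

# Every rule that fails its check is a policy breach to remediate
breaches = [name for name, check in policies.items() if not check(observed_state)]
print(breaches)  # ['ssh_permit_root_login_disabled']
```

Because the policy is data plus code, it can live in version control, be reviewed like any other change, and be evaluated automatically at whatever cadence you choose.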

Measurement moves from a manual state to an automated one. We write the policy code in OpenSCAP and remediate policy breaches with the Puppet code we have written. Threats, environment changes and policy changes still occur, as we expect.

The difference in the DevSecOps world is that we update the policy & remediation code in minutes, and then roll it out to the organization in hours.

That’s the true power of DevSecOps!

DevSecOps talks to how the principles of DevOps can be applied to the broader security context.

The path to building a culture of security in an organization needs to follow a similar path to that of DevOps: set the right expectations of outcome then empower and measure.

In this world of security challenges, can you afford not to do DevSecOps?

The Future

Look out for my next post on how to apply DevSecOps in a containerized environment!

by Jason Suttie

The changing world around programmers


In today’s ever-changing world, we find that businesses have become more concerned about what you can do rather than what qualification you have. This paradigm is becoming more apparent as companies face an unbelievable shortage of decent coders who are able to deliver to their expectations. This gap in the employment market is widening, as the average university turnout of BSc Computer Science graduates is far less than actual demand.

This situation has led the industry to change the way it looks at qualifications and to focus more on a person’s ability to code and learn. If you are a self-taught coder with an understanding of industry-relevant technology, you are in a much better position than someone who still has to go to university and learn coding there for the first time. A few companies are willing to take the risk of hiring someone without formal coding qualifications, and have reaped the rewards of taking those risks. The coders they hire generally seem to be more aware of what new technology is available, and are more willing to learn something new in order to help them grow further.

We are starting to see a paradigm shift in the industry and in the way people think. The Stack Overflow developer survey shows that the proportion of self-taught developers increased from 41.8% in 2015 to 69.1% in 2016 – many developers are self-taught, and more people are teaching themselves how to code each year. People who start to code from a young age show real passion for coding, and in combination with their curiosity for learning something new, their love for it speaks volumes. To have the ability to create anything they can think of on a PC, to make it behave exactly as they want and to see a visual representation of the result, is unbelievable.

For those interested in teaching themselves how to code, there are many websites to look at. Here is a list of 10 places you can learn coding from, though I will highlight the top 3 places that I learnt the most from:

Those websites have their own way of teaching code, and if you combine this with some YouTube videos from CS50 and MIT OpenCourseWare you will be all set to learn at your own pace. HackerRank is a good way to test everything you have learnt, and you can see how you rank against the world.

WeThinkCode_ is an institution that teaches coding to anyone from 17 to 35 years old. Their thinking is that you do not need a formal qualification to be a world-class coder. More institutes like this are opening across the world. The wide age range illustrates that you are never too old to learn how to code. There are also more and more coding education opportunities for young people, and it is easiest to learn to code from a young age, when your mind is at its prime for learning new things and adjusting to constant change.

 In a programmer’s world you are constantly learning new things and this is what makes our jobs exciting.

The world is ever-evolving and we all need to keep adjusting our mindsets on how we look at things, otherwise we will be left behind while everyone moves forward.

By Gabriel Groener

Why it took 400 years to invent the wing

You would be forgiven for thinking the three photographs below were various “Wright Flyers” piloted by renowned flight pioneers Wilbur and Orville Wright. They’re not – they are photographs of different flights that took place at roughly the same time…

THE MYSTERY OF SIMULTANEOUS INVENTION

You would be forgiven for thinking the three photographs below were various “Wright Flyers” piloted by renowned flight pioneers Wilbur and Orville Wright.

NPS.gov
Wright-brothers.org

 

Wright-brothers.org

They’re not – they are photographs of different flights that took place at roughly the same time as the famous Wright brothers’ flight. Not all of these inventors knew of each other’s existence prior to their inventions. This bizarre case of “simultaneous invention” has occurred many times, both before and since the Wright brothers’ flight. The polio vaccine was developed by three separate scientists almost at once. The patent for the telephone was filed by two separate individuals on the same day.

Why does innovation occur simultaneously? We tend to have an idealised view of how scientists work: we picture an individual in a workshop making a few sketches and shouting out in joy at having thought of the wing. If this were indeed the case, the occurrence of simultaneous inventions would almost defy logic.

The reality is that inventors and innovators absorb the views, thoughts and ideas of their day, as well as its existing technology, and it is this trait that explains the phenomenon of simultaneous discovery.

The story of fixed-wing self-powered flight

The first recorded study of flight was Leonardo da Vinci’s “Codex on the Flight of Birds” in 1505. John Smeaton was the first to attempt to quantify the phenomenon of lift, prior to 1800. Building on the concept of lift, George Cayley conceived the cambered airfoil just after 1800 and made the world’s first glider, though it could barely travel any practical distance. Otto Lilienthal, in 1889, took experimentation to a new level: absorbing the thoughts of his day, he made an astonishing 2,500 glides and documented his findings in the famous “Lilienthal tables”.

The Wright brothers could not reproduce the data in the Lilienthal tables because of an error in a coefficient developed by Smeaton over 100 years earlier. Out of frustration, they went beyond Lilienthal’s experiments by creating the world’s first rudimentary wind tunnel. They realised that camber, aspect ratio and angle of attack all contributed to lift. And so, 398 years after its first study, Wright Wing Number 31 was selected for the historic flight. The wing by itself was insufficient for the flight – they had to procure the latest internal combustion engine to power the plane. Luckily for them, this had been developed in parallel, with its own intricate history. The Wright brothers were, by profession, bicycle manufacturers, not backyard inventors or carpenters – the perfect candidates for flight pioneers. Think lightweight rivets, spokes, wheel rims and tubes.
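To make the Smeaton error concrete: the period lift equation used by Lilienthal and the Wrights multiplied wing area, airspeed squared and a lift coefficient by “Smeaton’s coefficient” k. Smeaton’s published value was about 0.005 (in the imperial units of the day), while the Wrights’ wind-tunnel work implied roughly 0.0033 – so the old tables overpredicted lift by about 50%. The sketch below uses illustrative numbers of my own, not a reconstruction of Wing No. 31:

```python
def lift(k, area_sqft, speed_mph, cl):
    """Period lift equation: L = k * S * V^2 * C_L (pounds)."""
    return k * area_sqft * speed_mph ** 2 * cl

# Illustrative wing area (ft^2), airspeed (mph) and lift coefficient.
S, V, CL = 500.0, 25.0, 0.5

predicted = lift(0.005, S, V, CL)   # using Smeaton's published coefficient
measured = lift(0.0033, S, V, CL)   # using the Wrights' corrected value

# Fraction by which the old coefficient overpredicts lift (~50%).
overprediction = predicted / measured - 1
```

Run with any plausible wing numbers, the ratio is the same: the tables promised roughly half again as much lift as the wing could deliver, which is why gliders built faithfully from them kept falling short.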

The Adjacent Possible explains simultaneous discovery

The same fascinating story can be found in numerous other inventions, such as the Gutenberg press: the movable type, the press, paper and ink all have stories of their own, and few of them trace back to “Eureka” moments. Steven Johnson popularised the concept of the adjacent possible, which originally has its roots in biology. As Steven Johnson writes in the Wall Street Journal:

“the [adjacent possible] boundaries grow as you explore them. Each new combination opens up the possibility of other new combinations. Think of it as a house that magically expands with each door you open. You begin in a room with four doors, each leading to a new room that you haven’t visited yet. Once you open one of those doors and stroll into that room, three new doors appear, each leading to a brand-new room that you couldn’t have reached from your original starting point. Keep opening new doors and eventually you’ll have built a palace.”

We can therefore argue that no matter how much of a genius Da Vinci was, he could not possibly have made a flying machine back in 1505. He was not at the boundary of the adjacent possible. He most certainly contributed to it, as part of the Renaissance, by laying the early foundations of putting innovative thoughts on paper. John Smeaton’s lift equation was wrong, but it was a critical contribution in that it attempted to quantify the mysterious phenomenon of lift in an equation, and it enabled Otto Lilienthal to record his famous tables.

What can Fintech Learn about the Adjacent Possible?

The story of the wing is an extreme case study of iterative innovations towards a single invention.

Players in the Fintech space could learn from this theme. Innovation happens on the boundaries of the adjacent possible: for example, cryptocurrencies could not be implemented before it became possible to hold distributed ledgers on multiple databases connected in a common consortium. Eureka moments are indeed rare. Innovation initiatives should reach out to the world to absorb the thoughts and ideas of the day, and businesses should look within their own boundaries to find their own “Lilienthal tables” – to see what worked and what didn’t, in order to innovate effectively.

The Palau island tribes did, in theory, implement a blockchain as early as 500AD, but the “ledger” was effectively a set of narratives held by the tribes’ elders.
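The contrast between the elders’ narratives and a modern distributed ledger can be made concrete with a toy hash chain, where each entry cryptographically commits to everything before it. This is a minimal illustrative sketch – the function names and block structure are my own, not any production blockchain design:

```python
import hashlib
import json

def block_hash(contents):
    """Deterministic SHA-256 over a block's contents."""
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def append(chain, entry):
    """Append an entry, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"entry": entry, "prev": prev}
    block["hash"] = block_hash({"entry": entry, "prev": prev})
    chain.append(block)

def valid(chain):
    """Tampering with any block breaks every link after it."""
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash({"entry": b["entry"], "prev": b["prev"]}):
            return False
        prev = b["hash"]
    return True

ledger = []
append(ledger, "Alice pays Bob 5")
append(ledger, "Bob pays Carol 2")
assert valid(ledger)

ledger[0]["entry"] = "Alice pays Bob 500"  # retroactive edit
assert not valid(ledger)                   # the chain exposes it
```

With copies of such a chain held by every member of a consortium, a retroactive edit is detectable by anyone – which is exactly the capability the elders’ oral ledger lacked.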

by Dejan Popovic

Which race are you in?

As we hurtle head-on into 2017 it’s becoming increasingly clear that no matter what generation you find yourself in – Xers embracing tech, Ys passionately living the dream or Zs pushing us all faster than we ever believed we could go – if you are not concentrating… this digital world will run right past you before you blink.

http://europe2017.finovate.com/

At Finovate 2017 in London last week, I was struck firstly by the intensity of this pace – the leaps that tech has taken over the past year, but also, and more importantly, by the spirit of partnership.

No longer are we in a world where competition is about being the fastest or the smartest; we are living in a world where winning is about bundling those that are faster and smarter than you into meaningful solutions for the business you are in and the clients that you serve.

In banking it’s too late for us to say “let’s build our own” or “let’s throw money at disruption”; we need to get our heads around connecting fintech dots to build the best solutions for our clients. In biometrics and authentication the solutions are overwhelming, and similarly in app design and integration. Banking is less and less about paper trails and complicated products and more about integrating whole-life solutions with ease of use and integrated platforms. It’s not about selling products at all, and more about connecting the right client to the product they need for the time of life they are in – most often aided by a funkily named chatbot. The worlds of social media and banking have already converged (yup, that ship has sailed), and payments is fast becoming something everyone does… everyone! We can already buy packaged analytics and information about pretty much anything we need.

Banking has morphed from functional practicality to gorgeous design, insightful user experience and lifestyle products that adjust to the needs of its customers. Tricky thing is that much of that “banking” isn’t coming from banks! So what on earth should banks be doing?

Concentrating? Yes. Trying to keep up? No. Collaborating? Absolutely!

Finovate entrepreneurs brought solutions to banking problems we never even knew existed. They challenged views of what banks do and encouraged us all to ask “how can we help you help us help our clients?” More importantly, though, they showed what collaboration brings. Over and over, as the 7-minute spots passed by, it was clear that these entrepreneurs are building on one another’s work – each using bits of what others had built to supersize the solutions they were prototyping.

And that is the way to stay in the race! So as we train for the year ahead, we need to make sure we have the insight to navigate the way forward, the partnerships with fintechs to supersize our banking offerings, and the deep relationships with clients to package this stream of incredible ideas in ways that make them not only satisfied but thrilled with the way they interact with our ecosystem.

by Liesl Bebb Mckay

The Modern Programmer

IT professionals often don’t get an honest portrayal in the entertainment industry and, for better or worse, the mass perception of Computer Science has been influenced by what people see on their TV screens. Either we sit in a dingy dark room, littered with empty energy drink cans, staring at a terminal with green font flashing past at light speed – with sound effects – or we are cool rich guys creating programs that become self-aware.

IT professionals often don’t get an honest portrayal in the entertainment industry and, for better or worse, the mass perception of Computer Science has been influenced by what people see on their TV screens. Either we sit in a dingy dark room, littered with empty energy drink cans, staring at a terminal with green font flashing past at light speed – with sound effects – or we are cool rich guys creating programs that become self-aware. There really isn’t a middle ground, and these perceptions either drive people to develop an insatiable curiosity in the field or make them fearful, believing that they aren’t mentally fit to join the club.

http://i.imgur.com/heb9csO.jpg

The demographic of the modern programmer isn’t what it was back in the 70’s. Most IT professionals were – well… professionals. They were mathematicians, engineers, scientists, accountants and the like, often in their 30’s or 40’s, and the programming industry was almost 50% women. What on earth happened?

Well, I have a theory. Computer Science (CS) wasn’t widely offered as a university course at that time, so youngsters really had no way of entering the field. Not to mention that what they called a computer back then isn’t what we have today: they were big, expensive and far fewer in number, and there were no operating systems. Programmers wrote code by hand, which was then converted into punch cards that could be fed into the computer, and you had better pray that what you wrote was correct – which, if you code, you know it often isn’t – because otherwise you would have to start that lengthy process from scratch. Blessed are those that came before us, for they were a resilient few. By the time CS courses were common it was the 80’s, and young adults could learn how to code.

http://i.imgur.com/27vs3iD.jpg

The 80’s was definitely one of the most defining times in modern history. We saw technology really being embraced in the media: Back to the Future, Ghostbusters, Star Wars, Terminator and many more franchises showed us a world of technology that seemed almost impossible. In lots of ways we are still catching up to the imaginations of those filmmakers and science fiction writers. But I find this time interesting mostly because it gave birth to the geek culture that has lasted to this day. That culture was very young and male-dominated – a kind of cult to those who were part of it – and this must have driven the women away. Women in general still don’t get the culture; heck, even I don’t get it to the degree of its hardcore followers.

Now think about how we perceived these “geeks” in society: beady-eyed, brace-faced, drooling, good-grade-getting teens with bad acne (is there good acne?) and thick glasses, always getting bullied by the “jocks”. The truth is that, in a quest to fit in, teens only hang out with the group that they relate to and/or that accepts them. Learning became the uncool thing, and disco was in. The media neatly crafted and packaged nerd culture: being a cool kid meant you didn’t even greet the nerd – unless shoving someone into a wall counted as a greeting. And so that was that. Programmers were part of a culture that embraced creativity, logic and intelligence and frowned upon anything less, because to be a programmer you needed to love learning and solving problems. Being a cool kid meant you had to love partying, gossip and creating problems.

http://www.philiployd.com/wp-content/uploads/2016/04/geek.jpg

Things have changed somewhat. Programmers today come in different shapes and sizes – still not many hourglass shapes, but we’re getting there. The next generation of teens will definitely be more in tune with technology and the true culture of the geek, or the “hacker”, and those who fail to see the power of new technologies will be left behind. Computers are much more accessible and schools are starting to teach coding. With innovative colleges like WeThinkCode_ and 42, the future of what we perceive as an IT professional will be completely different to what we have today.


It’s now up to us to make sure that our kids become programmers rather than the programmed. It’s in the small things that we spot the young coder: the little kid who breaks his or her toys to find out how they work. Kids are naturally curious, and it’s up to us to nurture that curiosity rather than reprimand or punish them for it. We interact with technology every day, and we would only be empowering our children by encouraging them to learn to control that technology as creators, in the same way that we might teach them to play a musical instrument. I envision a world where the modern programmer is anyone, in a society that frowns on those who shun learning. Let’s make it happen.

by Sherwin Hulley

Looking through the exponential looking glass

We are now in what some are calling the next industrial revolution. In this time, we are experiencing exponentials more than at any other time in history. An example of this is the exponential decrease in the cost of solar electricity generating technology: the cost of electricity will, in our lifetime, tend to zero.

We are now in what some are calling the next industrial revolution. In this time, we are experiencing exponentials more than at any other time in history. An example of this is the exponential decrease in the cost of solar electricity generating technology: the cost of electricity will, in our lifetime, tend to zero. This not only allows us to solve our own electricity challenges here in South Africa but opens many possibilities for solving other challenges, such as water security. With the cost of electricity tending to zero, the business case for large-scale desalination plants becomes more realistic. Given that South Africa has just survived the worst drought in living memory, many people are now conscious of water security and are looking for their own solutions.

 

I’ve just described some of the positive effects of exponentials, but there are going to be effects that are harder for humanity to deal with. Let’s consider another exponential, one that encapsulates artificial intelligence, deep learning and robotics. We already have driverless cars – experiments on primitive driverless cars began in the 1920’s, and the technology is now mainstream. Whether human beings are ready for them is another question. The impact of driverless cars is profound: firstly, we no longer need human drivers; secondly, we have fewer accidents, as computers drive better than humans; and thirdly, all our current motor vehicles become obsolete, with little value other than in remote areas of the world.

 

Let’s extend a bit further into robotics and genetics. Robots will replace human farmers, as they will be able to manage an entire farm. Crops will be farmed as raw materials for advanced food-printing technology, and animals will no longer be farmed, as any protein can be printed more cost-effectively. Maintenance robots will maintain other robots as well as themselves. Food will be almost free, as the costs of electricity, water and robots become negligible. In this time of abundance human beings will live in homes built by machines, and food, water, healthcare and education will all be free. Disease will be eradicated by genetic medicine printed at home after diagnosis by doctor robots. Our children will all have online tutors who are dedicated and configured to optimally educate each child. Humans may take real holidays from time to time, but mostly they will prefer alternative-reality holidays taken from the comfort of their homes.

 

In this time of great abundance, human beings will have lots of time on their hands. With many needs met, people will be looking to find purpose in their lives. There are many schools of thought on what we as humans will focus on; I believe the simple answer is that humans will look toward bigger challenges. Humanity has the gift of being able to manifest to the extent of its imagination, and the focus will turn to terraforming deserts on Earth into arable land and colonising the Moon and Mars. Since very few people will volunteer for Moon and Mars missions, robots will be sent to establish good living conditions before humans move in earnest. Humanity will finally close the ecological divide – the divide in the minds of humans between their actions and their impact on the planet. Technology will be very useful in helping to undo the sins of the past: vast ships will extract plastics from the ocean, localised plants will remove toxins from the air, and genetically modified plants and animals will extract poisons from our waterways. Ecosystems will begin to re-establish themselves after animal farming stops, and nature reserves will still exist to contain the more dangerous animals.


Exponentials are the key to these dramatic shifts in the world, and the changes may happen faster than we think. We are experiencing exponentials in almost every aspect of our lives, so look forward to profound positive changes for both humanity and the Earth, soon.

by Jason Suttie

 

A case of keeping up with the Joneses?

The world is changing. Drones are fighting in armies, driverless cars are no longer a figment of someone’s imagination, and robots can outsmart humans in offering legal and financial advice. Exciting? Most definitely. But for a developing country such as South Africa there is also a sense of uneasiness.

South Africa and the Fourth Industrial Revolution

Source: genesisnanotech.com

The world is changing. Drones are fighting in armies, driverless cars are no longer a figment of someone’s imagination, and robots can outsmart humans in offering legal and financial advice. Exciting? Most definitely. But for a developing country such as South Africa there is also a sense of uneasiness. How will our economy keep up with a changing world while having to fight poverty, inequality and corruption? As one of the most unequal societies of the current age, one has to wonder what technological advances will do to an already high Gini coefficient. Are we entering a dystopian world order where some countries ride the wave of a revolution and others are left behind, feeding on the scraps?

Prof. Klaus Schwab of the World Economic Forum has proposed that a fourth industrial revolution is imminent. An industrial revolution is a change in the basic economic structure, driven by innovation and invention: in the late 18th century, coal and steam power introduced the first industrial revolution, with mechanization as the key driver. The fourth industrial revolution will fundamentally change the world we live in through so-called cyber-physical systems, where the natural, human and digital worlds meet. In short, extreme automation and connectivity may blur the boundary lines between humans and technology, with concepts such as artificial intelligence, virtual reality and the internet of things taking civilization where it has not gone before.

While South Africa’s economy relies heavily on its natural resources, the question is whether the country has progressed sufficiently up the ladder of previous industrial revolutions – or rather, whether a platform has been created from which to launch into the rising age. One simply has to look at the state of the railways, and even the huge amount of manufacturing that happens off-shore, to wonder whether South Africa has been able to progress industrially with the rest of the world. UBS highlights in a white paper (Davos 2016 White Paper) that South Africa is behind the curve with regard to the evolution of manufacturing alongside demographics. In effect, the failure to move to a higher level of manufacturing during its demographic prime could indicate that SA has not adequately adapted to the second and third industrial revolutions.

The question remains whether the fourth industrial revolution would be a case of keeping up with the Joneses or whether it would be possible to fast track development in order to launch South Africa into the new era.

Africa is a continent with a certain reputation for innovation, particularly innovation that overcomes obstacles created by a lack of development in a given area. The continent is also seen as an early adopter of technology, with success stories of financial applications such as M-Pesa flying the flag for an innovative continent. Pockets of excellence mean that it is not all doom and gloom for South Africa: the financial sector is one of the best in the world and may well be a field in which new technologies gain traction.

The WEF looks favourably on South Africa in terms of innovation and the new fourth-industrial-revolution indexes in its Global Competitiveness Report (2016-2017) (WEF – Global Competitiveness Report), with the following global ranks: innovation and sophistication (31st), business dynamism (50th) and innovation capacity (38th) – perhaps indicative of a silver lining of promise that the country will punch above its weight in the revolution.

Although South Africa is ranked low in basic requirements and some efficiency parameters, indicating a lag in fully adapting to previous industrial revolutions, positive innovation rankings paired with the right industry, such as financial markets, could position the country favourably for the move towards the fourth industrial revolution.

It would be easy for South Africa to adopt a fearful approach towards the new era of automation and robotics. In a country with high unemployment, low-skilled workers are likely to face a job market with no space for them, and some pessimists are predicting a world where humans are not needed at all. The reality is that change is coming, and being a country that is able to adapt is non-negotiable. Education has been a pain point: South Africa is ranked a low 123rd on the Global Competitiveness Index for health and primary education. Education is critical to the challenges a new revolution will bring, and ensuring that a new generation can hold their own in a changing world will make the difference between those running the race well and those falling out along the way. Adapting education models is an important measure to put in place if South Africa wants to approach the fourth industrial revolution with courage.

In our experience, successful disruption is often birthed in a start-up atmosphere. Where current incumbents, especially in the financial sector, are faced with issues such as integration and legacy systems, a start-up with a dedicated, talented team and a great innovative idea has a greenfield approach that lends itself to disruption.

Drawing the parallel to country-wide development, there is a case for cheering on Africa’s and South Africa’s jump into the fourth industrial revolution, modelled as a start-up with the leeway to embrace the new era and its shifting boundaries – leapfrogging some of the essential factors of previous revolutions towards a new, disruptive way of using computer systems to address fundamental social problems. But investment in skills, particularly software development, and in the required infrastructure will need to be fast-tracked to create an enabling environment for the innovative nature of South Africans – not just to keep up with the rest of the world, but to leverage a strong innovative culture and excel.

by Charlotte Hauman