Innovation is a choice

There’s nothing like a week in another city to get the innovation juices flowing, and London Fintech Week was exactly that (thank you Luis and team for another week of great debate, networking and insights).

So, what is new in the world of Fintech? Well, if the speakers and panelists are to be believed, and the messages were far too similar and consistent for them not to be…

…AI is playing an increasingly important role in the world, and in the world of banks in particular. It is clear that winning in the investment banking game will still require smart people – but we must couple smart banker types with AIs, and we must broaden our definition of “banker types” to include engineers and mathematics PhDs.

…Blockchain is here, and it’s all grown up. No longer a concept for alternative funding and the underworld, the cryptocurrency conversation is upping the volume at the highest levels, with countries like Canada, the UK and Singapore all running projects, and banks of all sizes experimenting and building applications in both cryptocurrencies and blockchain technologies. Even the highly volatile cryptocurrency prices over the week did nothing to dampen the enthusiasm. With the rise of open source, I expect we will see increasing opportunities to move from our existing centralized models to new blockchain-enabled ones in many economies and industries.

…Trust is no longer about relationships, nor the strength of your brand. It’s about ease of use and, increasingly, peer review. Customers are no longer seeking an experience similar to the one they get from other banking brands; they’re looking for an experience like the one they get from mega brands such as Apple and Amazon. Banks are going to need to up their game – and quickly!

Clients know that if they are not paying, they are the product. Both banks and clients know the power of their data – what will this mean in the future? How will this change their expectations of service? Security now becomes as important as service; will clients demand due diligence of their service providers to ensure that their data is secure?

…Innovation is a choice – it doesn’t just happen. This is potentially the most important message of all. Entities that are leading in the start-up and innovation space are choosing to innovate – they are seeing the possibilities that innovators bring and are finding creative ways to enable them. The innovation choice is being made at the highest level – countries like India, Canada, the UK, Germany, the Netherlands, and China are all facilitating innovation communities, and the start-ups and banks coming out of those countries are moving faster than others because of it. It is a choice because there are millions of reasons and costs involved with creating change, but forward-thinking leaders recognize the importance, and their choice to enable means they are leaping ahead.

…It’s organizational cultures that will make the space for innovation and those cultures look to leaders for the messages they need. Coincidentally, I just finished reading “Under the Hood” by Stan Slap where he describes how to maximise business performance. Culture understands leadership motivators beyond words and culture works exceptionally hard to protect its own existence – so innovation will simply not happen without leaders giving the right messages. Innovation is a choice leaders have to make and their actions will send the clear message.

Everything we know, the way we work and the way we behave was all once created as a leadership or cultural choice. In this exponential era, we will need to change the stories we tell, the way in which we work and the technologies we believe in. It’s ridiculously exciting – and it’s moving…well, exponentially! At last year’s event, there was talk of what blockchain is and what AI could conceivably do; this year it was all about what businesses are being built on these technologies. I can’t wait to see what the next year brings!

by Liesl Bebb-McKay

 

A curriculum for growing your data science skills (almost) for free

I hear and I forget. I see and I remember. I do and I understand – Confucius

With the plethora of free (or at least reasonably priced) high-quality massive open online courses (MOOCs), free online textbooks and tutorials, the tools available to aspirant data science apprentices are many and varied. From courses offered by Coursera to freely available eBooks and code examples to download from GitHub, there are many useful resources at our disposal.

Demand for data science skills remains consistently high. IBM predicts that appetite for data scientists will grow 28% by 2020. Job postings for data science skills in South Africa are rising rapidly as companies begin to realise the true value of their data initiatives.

According to IBM, the current most desirable and lucrative skills include machine learning, data science, Hadoop, Hive, Pig and MapReduce. It is interesting to note just how many data engineering type skills are in demand. I recently started to set up a data lab at the Foundery based on the Hortonworks distribution of Hadoop, and I can understand why this is true – (big) data engineering is unnecessarily complicated!

Over the last few years, I have completed (and sometimes part-completed) several data science MOOCs and tutorials. I have downloaded free eBooks and textbooks – some good and some not so good. These, along with the MOOCs, have become my primary source of knowledge and skills development in the data science domain. I am finding this form of online learning to be a very efficient and effective way to grow my knowledge and expertise. However, my choice of which courses to do has been haphazard at best, and the sheer volume of options has made it difficult to find the right courses to pursue, often leading to me abandoning classes or not learning as well as I should.

The purpose of this blog, therefore, is twofold: to create a thoughtful and considered curriculum that I can follow to elevate my data science mastery, and to share with you some of the resources that I have collated in researching this proposed curriculum. Whether you are a seasoned data science expert or an absolute beginner in the field, I believe there is value in some, if not all, of the topics in the curriculum.

The ultimate ambition of completing this proposed curriculum is to vastly (and more efficiently) improve my mathematics, statistics, algorithm development, programming and data visualisation skills – to go from a journeyman-level understanding of data science to full-on mastery of advanced data science concepts.

I want to DO so that I can better UNDERSTAND. Eventually, I’d like to understand and implement advanced machine learning and deep learning concepts (both from a theoretical and practical perspective) as well as obtain more in-depth expertise in big data technology. I also aim to improve my data visualisation skills so that I can have more impactful, interesting and valuable discussions with our business stakeholders and clients.

The day that I can have a debate with my maths colleagues about advanced mathematical concepts, compete with the computer scientists on HackerRank coding challenges, run my models on a big data platform that I have set up, create beautiful and insightful visualisations AND make this all understandable to my wife and daughter is the day when I know I have been successful in this endeavour.

I have proposed this curriculum based on the skills that are commonly acknowledged to be required for data science, as well as on course ratings, popularity, participant reviews and cost. I have tried to be as focussed as possible, and my thinking is that this is the most efficient plan for gaining deep data science skills.

This curriculum will be based on open-source programming languages only, namely Python and R. My initial focus will be on improving my Python skills where possible, as I want to get these up to a level where I can implement Python-based machine learning models in NumPy/SciPy. I do acknowledge, however, that for many of the stats- and maths-related courses R is often preferred, and in that event I will switch.
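To make this concrete, here is a minimal sketch of the kind of NumPy-based model I have in mind: a linear regression fitted with batch gradient descent. The data and hyperparameters are made up purely for illustration – this is simply the level of from-scratch implementation I am aiming for, not material from any particular course.

```python
import numpy as np

# Toy dataset: y = 3x + 2 plus noise (invented for illustration)
np.random.seed(42)
X = np.random.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 2 + np.random.normal(0, 1, size=100)

# Add a column of ones so the intercept is learned as an ordinary weight
Xb = np.hstack([np.ones((X.shape[0], 1)), X])
w = np.zeros(2)

# Batch gradient descent on the mean squared error
learning_rate = 0.01
for _ in range(2000):
    residuals = Xb @ w - y                    # prediction error per sample
    gradient = 2 * Xb.T @ residuals / len(y)  # gradient of the MSE
    w -= learning_rate * gradient

print("intercept, slope:", w)                 # should land close to (2, 3)
```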

Given my work commitments and the fact that we have a new (and very loud) addition to our family, I think that I would likely only be able to devote 10 hours a week to this challenge. My proposed timetable will, therefore, be based on this estimate. The current estimate to fully complete the curriculum stands at 110 weeks, or just over 2 years! This is going to be a long journey…

I plan to update this blog periodically as and when I complete a course. My updates will include a more detailed summary of the course, an in-depth review and score, how much it cost me as well as tracking how long the course took to complete relative to the advised timeframe provided by the course facilitators. My time estimates will be slightly more conservative relative to the time estimates for each course as, in my experience, it always takes longer than suggested.

Thank you for reading this far. If you wish to join me in growing your data science skills (almost for free) and help keep me honest and accountable in completing this curriculum, then please do read on.

Data Science Curriculum

0. Supplementary resources and setup

Sticking to the blog’s theme of finding low-cost resources for this curriculum wherever possible, I have found a few high-quality free online maths and stats textbooks. These will serve as useful reference material for the bulk of the curriculum. They are:

  • Think Stats – a freely downloadable introductory book on Probability and Statistics for Python. Code examples and solutions are provided via the book’s Github repository.
  • An Introduction to Statistical Learning with Applications in R – another freely available book that is described as the “how to” manual for statistical learning. This textbook provides an introduction to machine learning and contains code examples written in R. There is online course material that accompanies this book, which can be found here as well as here. I will use this manual, and potentially the associated MOOCs, as a reference when I begin the machine learning component of this curriculum.
  • Introduction to Linear Algebra is the accompanying linear algebra reference book for the MIT OpenCourseWare MOOC. This book is not free and may have to be purchased should the MOOC require it.
  • Although not free, Python Machine Learning by Sebastian Raschka has good reviews as a reference book for machine learning applications in Python. The book also has an accompanying code repository on GitHub.

1.  Start by focusing on maths and stats

The first section of the curriculum will allow us to concentrate on redeveloping fundamentals in mathematics and statistics as they relate to data science. University, over a decade ago now, was the last time I did any proper maths (yes, engineering maths is ‘proper mathematics’ to all you engineering-maths naysayers).

To relearn the mathematics and statistics required for data science and machine learning, I will focus on the following courses.

  • Statistics with R Specialisation – There were many courses available to improve my stats knowledge. Ultimately, I settled on this Coursera specialisation by Duke University as it seemed the most comprehensive, and the recommended textbook seems a good companion. The specialisation comprises 5 courses – Introduction to Probability and Data, Inferential Statistics, Linear Regression and Modelling, Bayesian Statistics and a capstone project written in R. Each course will take 5 weeks and will require 5-7 hours of effort per week. I will use this set of courses to improve my R skills, and I will audit the courses if possible; otherwise I may have to pay for the specialisation. [Total time estimate: 250 hours]
  • Multivariable Calculus – (Ohio State University) referencing the Multivariable calculus sections of the Khan Academy where required. This highly rated course (average rating of 4.8 out of 5 stars) will provide me with a refresher of calculus and will take approximately 25 hours to watch all the videos. I think I can safely add the same amount of time to go through all the tutorials and exercises putting the length of this study at 50 hours. [Total time estimate: 50 hours]
  • Linear Algebra – (MIT OpenCourseWare) referencing the Linear Algebra sections of the Khan Academy where required. I don’t know how long this should take to complete, so I will base my estimate on the previous course’s estimate of 50 hours. I chose this course because the author of the Introduction to Linear Algebra textbook, MIT Professor Gilbert Strang, conducts the MOOC. A short NumPy sketch of the kinds of operations these maths courses cover appears after this list. [Total time estimate: 50 hours]
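As promised above, here is a small sketch of how the maths in these courses maps onto NumPy once the theory is in place. The matrices and data are invented for illustration.

```python
import numpy as np

# A small linear system Ax = b of the kind an intro linear algebra course covers
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([3.0, 5.0, 3.0])

x = np.linalg.solve(A, b)                     # exact solution of the square system
eigenvalues, eigenvectors = np.linalg.eig(A)  # eigen-decomposition of A

# Least squares: the linear algebra workhorse behind linear regression
np.random.seed(0)
F = np.random.rand(50, 3)
target = F @ np.array([1.0, -2.0, 0.5]) + 0.1 * np.random.randn(50)
coefficients, _, _, _ = np.linalg.lstsq(F, target, rcond=None)

print(x)
print(eigenvalues)
print(coefficients)                           # should be close to (1, -2, 0.5)
```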

2.  Time to improve my skills in managing data science projects, experiments and teams

A large part of my work at my previous employer, and in my current job at the Foundery, is managing various data science projects and teams. I have a lot of practical experience in this domain, but I don’t think it would hurt to go back and refresh some of the core concepts that relate to effective data science project management. To this end, I managed to find an appropriate Coursera specialisation that aims to help data science project managers “assemble the right team”, “ask the right questions”, and “avoid the mistakes that derail data science projects”.

  • Executive Data Science Specialization – Johns Hopkins University. The entire specialisation is only 5 weeks long and requires 4-6 hours a week of effort. The courses on offer are titled “A Crash Course in Data Science”, “Building a Data Science Team”, “Managing Data Analysis”, “Data Science in Real Life” and “Executive Data Science Capstone”. I wasn’t able to obtain rating information for this specialisation. [Total time estimate: 50 hours]

3.  Improve my computer science and software engineering skills

When I first started out, I managed to pick up a few Unix skills (just enough to be dangerous as evidenced when I once took out a production server with an errant Unix command). Since then, and over time, I have lost the little that I knew (luckily for production support teams).

New and exciting software engineering paradigms such as DevOps have emerged, and code repository solutions like GitHub are now commonly used in both the data science and development industries. As such, I thought that some study in this domain would be useful in my journey.

I would also like to increase my knowledge of data structures and algorithms from both a practical and theoretical perspective. To this end, I have found an exciting and challenging University of California San Diego Coursera specialisation called “Master Algorithmic Programming Techniques”.

The courses that I am planning to complete to improve my computer science and software engineering skills are:

  • How to use Git and GitHub – a freely available course offered by Udacity with input from GitHub. This course is a 3-week MOOC rated 4.5 out of 5 stars from 41 student reviews, and it will require 6 hours of commitment per week. The course introduces Git and GitHub and will help me learn to use source control better, which in turn will greatly assist with project delivery of medium to large-sized data science projects. [Total time estimate: 30 hours]
  • Introduction to Linux – a freely available course from edX. This is an 8-week course rated 4 out of 5 stars from 118 student reviews, with over 250 000 students enrolled. Thoroughly covering this material will take between 40 and 60 hours, per the course notes. Gaining a firm understanding of Linux will allow me more control when using open-source data science environments and tools. [Total time estimate: 60 hours]
  • Introduction to DevOps – Udacity. This free course introduces the concept of DevOps and explains how to implement continuous integration, continuous testing, continuous deployment and release management processes into your development workflow. I am very interested to see how this could be applied to the data science world. The course does not have a rating and is 3 weeks in length, requiring 2-3 hours per week of effort. [Total time estimate: 10 hours]
  • Master Algorithmic Programming Techniques – This Coursera specialisation by the University of California San Diego comprises 6 courses – Algorithmic Toolbox, Data Structures, Algorithms on Graphs, Algorithms on Strings, Advanced Algorithms and Complexity, and the Genome Assembly Programming Challenge. Each course is 4 weeks of study at 4-8 hours per week. The individual courses were rated between 3.5 and 4.5 stars.

What excited me about this specialisation is that I would get an opportunity to learn and implement over 100 algorithms in a programming language of my choice from the ground up. I think that this would certainly improve both my knowledge about algorithms as well as my programming skills.
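As an illustration of the kind of from-scratch implementation work the specialisation involves (this is my own toy example, not course material), here is a breadth-first search that finds a shortest path in an unweighted graph:

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Return a shortest path from start to goal in an unweighted graph,
    or None if no path exists. graph maps each node to a list of neighbours."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

# Tiny example graph (made up for illustration)
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs_shortest_path(graph, "A", "E"))   # ['A', 'B', 'D', 'E']
```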

After looking a bit deeper at the course structure, it seems as if this specialisation is paid for at $49 per month until you complete it. So, the faster I do this, the cheaper it’ll be – nice incentive! [Total time estimate: 235 hours]

4.  Improve my base data science skills and up my Python coding abilities

At this stage of the curriculum, I would have solidified my maths and stats skills, improved my computer science and software engineering skillset, and brushed up on some data science project management theory. Before embarking on intensive machine learning material, I think that it might be a good decision to get back to basics and look at improving my base data science and visualisation skills and upping my Python coding abilities while at it.

One of my goals for this curriculum was to improve my communication skills by becoming a real data story-teller. An effective way to do this is to learn how to visualise data in a more concise, meaningful and, I guess, beautiful manner. I say beautiful because of an amazing data visualisation website called Information is Beautiful. Check it out; you won’t regret it.

  • Learning Python for Data Analysis and Visualisation – Udemy. Jose Portilla’s Udemy course is highly rated at 4.6 stars out of 5 from over 4 220 student reviews. Over 47 812 students have enrolled in the course. The videos total 21 hours, so until I can estimate this better, I will add 100% to my time estimate for completing the course. The course is focussed on Python and introduces topics such as NumPy, Pandas, manipulating data, data visualisation, machine learning, basic stats, SQL and web scraping (a minimal Pandas sketch of this kind of analysis appears after this list). Udemy often run specials on their courses, so I expect to pick this one up for between $10 and $20. [Total time estimate: 50 hours]
  • Data Visualization and D3.js – Communicating with Data – Udacity. This free course is part of Udacity’s Data Analyst nanodegree programme. The course provides a background in visualisation fundamentals and data visualisation design principles, and will teach you D3.js. It is an intermediate-level course that will take approximately 7 weeks to complete at 4-6 hours per week. [Total time estimate: 50 hours]
  • HackerRank challenges – HackerRank is a website that provides a very entertaining, gamified way to learn how to code. HackerRank offers daily, weekly and monthly coding challenges that reward you for solving a problem. The difficulty of the questions ranges from “Easy” to “Hard”, and I plan to use this to test my new-and-improved Python skills. Every now and then I will use this form of learning Python as a “break” from the academic slog. [Total time estimate: n/a]
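As noted in the first item above, here is a minimal sketch of the kind of Pandas and matplotlib work that course covers. The transactions table is invented for illustration.

```python
import pandas as pd
import matplotlib.pyplot as plt

# A toy transactions table (made up for illustration)
df = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb", "Mar", "Mar"],
    "channel": ["online", "branch", "online", "branch", "online", "branch"],
    "amount": [120.0, 80.0, 150.0, 70.0, 200.0, 65.0],
})

# Typical Pandas operations: filtering, grouping and aggregation
online = df[df["channel"] == "online"]
monthly_totals = df.groupby("month", sort=False)["amount"].sum()

# And a quick visualisation with matplotlib
monthly_totals.plot(kind="bar", title="Total amount per month")
plt.tight_layout()
plt.show()

print(online)
print(monthly_totals)
```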

5.  Learn the basics of machine learning from both a practical and theoretical perspective

The resurgence of machine learning (the science of “teaching” computers to act without being explicitly programmed) is one of the key factors in the popularity of data science and drives many of the biggest companies today, including the likes of Google, Facebook and Amazon. Machine learning is used in many recent innovations, including self-driving cars, natural language processing and advances in medical diagnosis, to name a few. It is a fascinating field, and as such, I want to gain a solid foundational understanding of this topic. It will also lay the foundation for understanding more advanced machine learning theory such as deep learning, reinforcement learning and probabilistic graphical models.

Machine Learning – Stanford University. Taught by Andrew Ng, this 10-week course is one of Coursera’s most popular courses and is rated 4.9 out of 5 from 39 267 student reviews. A commitment of 4-6 hours per week will be required.

Andrew Ng provides a comprehensive and beginner-friendly introduction to machine learning, data mining and pattern recognition, built around several case studies and real-world applications. Supervised and unsupervised learning algorithms are explained and implemented from first principles, and machine learning best practices are discussed.
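The Stanford course implements its algorithms from first principles in its own programming environment; purely as a Python-flavoured illustration of the same kind of supervised learning task (not course material), a scikit-learn classifier on a bundled toy dataset might look like this:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# A classic toy dataset bundled with scikit-learn
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Fit a simple supervised classifier and evaluate it on held-out data
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
predictions = model.predict(X_test)

print("test accuracy:", accuracy_score(y_test, predictions))
```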

This course is a rite of passage for all aspirant data scientists and is a must-do. If you are on a more advanced level of machine learning understanding, look for the handouts of the CS229 Machine Learning course taught at Stanford (also by Andrew Ng) for further material. [Total time estimate: 80 hours]

Machine Learning A-Z: Hands-On Python & R In Data Science – Udemy. This is a highly rated, practical machine learning course on Udemy. It is rated 4.5 stars out of 5 based on 11 798 student reviews, and 86 456 students had signed up to the course at the time of writing. The videos total 41 hours and, as before, I will double this for my effort estimate.

The course is very hands-on, and comprehensively covers topics such as data pre-processing, regression, classification, clustering, association rule learning, reinforcement learning, natural language processing, deep learning, dimensionality reduction and model selection. It can be completed in either R or Python. Again, I will look to pick this one up on a special for between $10 and $20. [Total time estimate: 80 hours]
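As a small Python-flavoured illustration of the unsupervised and pre-processing side of that syllabus (my own toy example, not course material), here is k-means clustering on invented customer data, with features scaled first:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Toy customer data: monthly spend and number of transactions (invented)
np.random.seed(0)
spenders = np.random.normal([500, 5], [50, 1], size=(50, 2))
savers = np.random.normal([100, 20], [20, 3], size=(50, 2))
customers = np.vstack([spenders, savers])

# Standardise the features, then cluster into two groups
scaled = StandardScaler().fit_transform(customers)
kmeans = KMeans(n_clusters=2, random_state=0, n_init=10).fit(scaled)

print(kmeans.labels_[:5], kmeans.labels_[-5:])   # the two groups separate cleanly
print(kmeans.cluster_centers_)
```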

6.  Our capstone project – let’s dive into the deep learning end

We have finally made it to what I regard as the curriculum’s capstone project – a practical course on deep learning:

  • Practical Deep Learning For Coders, Part 1 – fast.ai. Out of all the courses that I have looked at, I am probably the most excited about this one. Fast.ai’s deep learning course is a very different MOOC to the rest in that the content is taught top down rather than bottom up. What this means is that you are taught how to use deep learning to solve a problem in week 1 but only taught why it works in week 2.

The course is run by Jeremy Howard, who has won many Kaggle challenges and is an expert in this field. The problems solved and datasets used in the course come from previously run Kaggle challenges, which allows you to easily benchmark your solutions against the best submitted entries.

A significant time commitment is required for this course – 10 hours a week for 7 weeks. The course teaches you some cool stuff, such as how to set up a GPU server in the cloud using Amazon Web Services and how to use the Python Keras library. As per the Keras homepage, “Keras is a high-level neural networks API developed with a focus on enabling fast experimentation. It is written in Python and is capable of running on top of either TensorFlow, CNTK or Theano.”
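For a sense of what the Keras library looks like in practice (this is a generic illustrative sketch, not material from the fast.ai course; the data and architecture are placeholders), a minimal fully connected network can be defined in a few lines:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Placeholder data standing in for a real Kaggle-style dataset
X = np.random.rand(1000, 20)
y = (X.sum(axis=1) > 10).astype(int)

# A minimal fully connected network: two hidden layers and a sigmoid output
model = Sequential()
model.add(Dense(64, activation="relu", input_dim=20))
model.add(Dense(32, activation="relu"))
model.add(Dense(1, activation="sigmoid"))

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=1)
```

The appeal is clear: the layer-by-layer Sequential API keeps the model definition readable while TensorFlow, CNTK or Theano does the heavy lifting underneath.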

As Jeremy Howard says, all you need to succeed in this course is pragmatic programming, tenacity, an open mind and high school math – so good luck, and well done on getting to this stage! [Total time estimate: 100 hours]

7.  Conclusion

So, we have finally made it to the end – well done! I have reviewed a countless number of courses in compiling this curriculum, and there were so many more that I wanted to add, particularly on more advanced topics.

I have also not touched on related topics such as big data, data engineering, data management, data modelling or database theory for structured and unstructured data. An understanding of these topics is nonetheless vital for grasping the end-to-end spectrum that makes up the data analytics continuum. Nor have I chatted about the myriad data science tutorials and Kaggle-like data science challenges out there.

I intend to look at relevant tutorials and Kaggle problems where they relate to parts of this curriculum, and where possible I will try to implement some of these solutions on a big data platform. While discussing this topic with one of my colleagues, he suggested also building something big enough to encompass all of the above, so that I have an end-target in mind, don’t get bored, and get to implement something that I am passionate about from the ground up. This is certainly something that I will also consider.

This challenge will start on 10 July 2017. According to my estimate, this curriculum will take 110 weeks, or just over 2 years! As daunting as this sounds, I take heart from Andrew Ng, the machine learning expert, who said the following in an interview with Forbes magazine:

In addition to work ethic, learning continuously and working very hard to keep on learning is essential. One of the challenges of learning is that it has almost no short-term rewards. You can spend all weekend studying, and then on Monday your boss does not know you worked so hard. Also, you are not that much better at your job because you only studied hard for one or two days. The secret to learning is to not do it only for a weekend, but week after week for a year, or week after week for a decade. The time scale is measured in months or years, not in weeks. I believe in building organizations that invest in every employee. If someone joins us, I can look them in the eye and say, “If you come work with me, I promise that in six months you will know a lot more and you will be much better at doing this type of work than you are today.”

I hope that this quote resonates with you too and that the blog has helped or motivated you to improve your data science skills. Thank you for reading this and please keep me honest in terms of completing this challenge. Please post a comment if you think I should add to or change the curriculum in any way, and post your own course reviews — let me know if there are any other books and textbooks that I should consider. Expect updates soon!

by Nicholas Simigiannis

 

 

Human-centered design

Gone are the days when design used to be about aesthetic execution, when the main focus was to get as much work out the door as possible so that there was more time for more work — when brands spoke down to their consumers instead of speaking to them.

Now most brands are waking up to the fact that we are living in an ever-changing, ever-growing, fast-paced world where the consumer has access to all kinds of information literally at their fingertips. And design plays a crucial part in the world we live in today. From the food we eat to the information we choose to consume online, design is everywhere.

Brands are cottoning on to the fact that their customers are more informed than ever before, and so big brands like Apple, Google, Uber, Airbnb, Facebook etc. are placing the consumer at the core of everything they do. This means that brands are now allowing their customers to decide on the type of content they want to consume and then designing for that. In the product design space this is so important — keeping the consumer at the center of everything that you do for a better experience.

Human-centered design is all about developing good relationships with your customer by delivering a high-quality product that, through prototyping and testing, results in an emotional connection between the customer and the product.

Fig. 1: The human-centered design pyramid (source: Giacomin, 2014)

Good design needs to be able to answer a set of key questions to facilitate this connection.

  • Who is the consumer? Does the design reflect the user’s characteristics?
  • What are the consumers’ goals when using the product?
  • What is their experience when using the product?
  • What are the goals of using this specific product or service?
  • When and how does the consumer interact with the product design?
  • What do consumers think about the product or the design?
  • Why does the consumer want to use this product or design?
(Questions sourced from http://www.designorate.com/characteristics-of-human-centered-design/)

 

Consumer feedback is key in ensuring that the design continually improves. And so, unlike before, the design process is never complete. Especially in the product design space, it is very important to keep iterating and making your product better with each iteration. Part of achieving that emotional connection with the product is about designing experiences as opposed to designing products.

The commonly used tools in building a human-centered approach are:

  • Personas;
  • Scenarios; and
  • Use cases.

 

Persona: This refers to creating a fictional character that could potentially interact with your product. It usually includes their age, race, gender, location etc. Basically, it’s the target audience.

Scenarios: This would be the possible scenario of the persona using your product.

Use cases: This refers to the feedback gathered from the Persona through the Scenarios.

It’s time that brands start immersing themselves in the worlds of their consumers if they want to remain relevant.

by James Mokhasi

Design Indaba made me do it

This was the mantra for the 22nd annual Design Indaba conference, hosted by the beautiful city of Cape Town at the Artscape theater.

The Design Indaba Conference has grown to become one of the world’s leading design events and hosts more than 40 speakers and 2 500 delegates. It draws creatives from all spheres and industries to come together under one roof to share knowledge, inspire and to collaborate with one another.

We talked, mingled and networked, filling our inspiration tanks. There were graffiti artists, DJs, musicians, sculptors and various sponsor pop-ups and activation units, inviting us into this world of endless possibility and creativity.

Contrary to current perception, Design Indaba is not a conference ONLY for creatives – it is for everyone, from any field of expertise, who would like to ignite their senses and intrigue their minds. It’s a jam-packed 3 days, and I believe there is something that will speak to anyone’s core. This year was my first Design Indaba and it was a truly immersive experience, exceeding all my expectations.

The main highlight for me wasn’t the skill or talent of all these amazing people (even though that was incredible) but rather their thinking – that really stood out to me. They took us on a journey through the lens and into their magical minds!

Ultimately, Design Indaba wants to change the thinking of the world, one conference at a time, one creative at a time, and one business at a time.

It will take a generation of creative thinkers and implementers to see a turnaround. Design Indaba’s primary aim therefore is “to advance the cause of design as a communication fundamental, a business imperative and a powerful tool in industry and commerce, awakening and driving a demand for investment in intellectual capital”.

Investing nearly two decades in this vision, Design Indaba has championed the creative revolution. Here are some of my highlights from the 3-day event (content supplied from the Design Indaba weekly mailer):

The enchanted forest – Can beauty redeem us?

We were welcomed into the Design Indaba Festival 2017 through an enchanted forest of massive tree sculptures that were beautiful and surreal.

These tree sculptures were on exhibition for the entire conference and added a magical ambience to the festival courtyard. I felt like I was walking around in a world that was a mash-up of the movies Labyrinth and Alice in Wonderland (the Tim Burton version).

Read more >

Capturing Cape Town’s scent with Kaja Solgaard Dahl

The thank-you gift for the festival this year was created by designer Kaja Dahl, who is fascinated with creativity that uplifts our experience and affects the senses directly.

Her process and the end product are captivating and just incredible. She truly did capture the scent of Cape Town – whimsical, fresh, enlightening, yet eccentric.

Read more >

Masters in the art of freestyling it

One of my main highlights of the festival was the amazing group called Freestyle Love Supreme. They would wrap up each day with freestyle rap and beatboxing. They were so entertaining and funny that I laughed so hard my face hurt.

The Design Indaba team chatted to Freestyle Love Supreme ahead of their Design Indaba daily wrap-ups and their once-off performance at Nightscape on the Thursday.

Read more >

 

 

Swahili launches on Duolingo

At Design Indaba 2017, Luis von Ahn launched the first African language course on Duolingo. The audience went wild when he told us, and he went on to say that the second African language to launch will be Zulu. We can’t wait to see more African languages on this amazing app.

Read more >

Arch For Arch: A coda for Design Indaba Festival Day 3

Arch For Arch was the spectacular finale of the 2017 conference and a tribute to Archbishop Desmond Tutu. It was a great honor and privilege for me to be part of this amazing ceremony and to hear the incredible and humble Archbishop Desmond Tutu speak. It was a great way to end an amazing festival; I left feeling inspired.

Read more >

Thank you to the Design Indaba team for the wonderful experience; we look forward to seeing where they go from here.

So, if you think that Design Indaba isn’t for you – think again. Book your ticket for next year and immerse yourself.

by Mari-Liza Monteiro