Software 2.0: What is it and why is it important?

You’ve seen classic Software 1.0 coding structures: meet Software 2.0, in which code is written by an optimisation process in the form of neural network training.

Software didn’t just revolutionise computing; it changed human culture forever.

In a little under a century, we went from being constructors, animals that manually built our tools, to the only species on Earth that programs its tasks; from fighting wars with bayonets to presidents being handed the codes for nuclear missiles. We moved from generally settling in small communities and keeping ourselves to ourselves to spreading out across the globe and keeping in touch via digital means. We transitioned from books and written knowledge to effectively democratised learning, enabling autodidacts to teach themselves about the world via Wikipedia and YouTube.

The words that you’re reading are possible because of software. The device you’re reading them on, too. So what if we could upgrade software? What if a shockwave ripped through the very fabric of the digital world? What could the next age of humanity look like? Enter Software 2.0, the idea that software could write itself.

This is a familiar concept to those interested in machine learning. This new brand of software can be expressed in more abstract terms than Software 1.0; there’s no need for a human to spell out exactly what’s being processed. Software 2.0 is written by a neural network: instead of programming every step needed to produce the final result, you give the computer something closer to a shopping list of outcomes you want it to work out for itself.

Frameworks such as TensorFlow and Keras work in a similar way: you define the architecture of the neural network, then train it on your data to produce a final result.
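
In practice, that looks something like the sketch below: a minimal Keras example in which the developer specifies only the architecture and the optimiser, and training fills in the behaviour. The data here is random placeholder data, so the shapes and numbers are assumptions chosen purely for illustration.

```python
import numpy as np
from tensorflow import keras

# Placeholder data (an assumption for this sketch):
# 1,000 samples with 20 features each, and binary labels.
x_train = np.random.rand(1000, 20)
y_train = np.random.randint(0, 2, size=(1000,))

# The Software 1.0 part: a human specifies the architecture...
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# ...the Software 2.0 part: optimisation "writes" the program's
# behaviour by fitting the weights to the data.
model.fit(x_train, y_train, epochs=5, batch_size=32)
```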

How do neural networks work?

Neural networks are sets of algorithms loosely modelled on biological brains. They interpret data by recognising, labelling and grouping it, and they aim to replicate how we think in order to produce a more intuitive, realistic result than anything that has to be hand-coded.

Take a spam filter, for example. Given a collection of emails as data, an algorithm could begin to separate spam from genuine messages, perhaps starting with the senders’ addresses. As it analyses more emails, it may discover that the language used in spam differs from that in legitimate mail. A neural network will gradually pick up these subtler hallmarks of spam and filter it out.

This is radically different from how conventional coding works: code does not learn. You can program an email client to place suspicious-looking email addresses into a folder, but it will never pick up on why those addresses are suspicious.
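
The contrast can be made concrete with a short sketch. For brevity this uses scikit-learn’s Naive Bayes classifier rather than a full neural network, and the four toy emails are invented, but the point is the same: the hand-written rule is frozen, while the learned model picks up which words signal spam from the data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical toy dataset; a real filter would train on thousands of emails.
emails = [
    "win a free prize now",            # spam
    "claim your cash reward today",    # spam
    "meeting moved to 3pm",            # genuine
    "here are the quarterly figures",  # genuine
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = genuine

# Software 1.0: a hand-written rule that never improves.
def rule_based_filter(text):
    return int("free" in text or "prize" in text)

# Software 2.0 flavour: the classifier learns which words signal spam.
vectoriser = CountVectorizer()
features = vectoriser.fit_transform(emails)
model = MultinomialNB().fit(features, labels)

# The learned model generalises to wording the rule never mentioned.
print(model.predict(vectoriser.transform(["your cash reward is waiting"])))
```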

Software 2.0: How neural networks work

Rather than the output being a direct function of the input data, hidden layers are introduced between the inputs and the outputs. With several layers of computation sitting between the input and the output, the network performs what is called “forward propagation” to produce a prediction, which is then compared against the known answers so the network can be corrected during training.
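
A minimal sketch of forward propagation through a single hidden layer is below; the layer sizes and the randomly initialised weights are assumptions for illustration, and a real network would learn those weights during training.

```python
import numpy as np

def sigmoid(z):
    # Squashes each value into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.random(4)            # input: 4 features
W1 = rng.random((8, 4))      # weights: input -> hidden layer of 8 units
W2 = rng.random((1, 8))      # weights: hidden layer -> single output

hidden = sigmoid(W1 @ x)       # activations of the hidden layer
output = sigmoid(W2 @ hidden)  # forward propagation yields the prediction

# Training would compare `output` against the known label and adjust
# W1 and W2 (via backpropagation) to shrink the error.
```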

Neural networks might sound complex but, in reality, they could end up simplifying software. The instruction set of a neural network is small, which makes it easy to place one onto a low-cost chip. Plus, if different Software 2.0 modules could interact, it could be possible for a web browser, for example, to automatically tune its components together for better efficiency.

Will Software 2.0 catch on?

Human beings are complex and tend to be pulled in two opposing directions. We’re simultaneously set in our ways, almost fearful of change, yet excited by innovation and the possibility of the world around us developing. Are we too attached to our existing ways for Software 2.0, or is this an upgrade we can’t ignore?

Software 2.0 offers a number of benefits that can’t be ignored by developers. From its portability to its potential for low power consumption, this new way of working could be much more efficient than Software 1.0 in a number of respects. On the other hand, we still don’t know everything there is to know about neural networks – the brain is famously complicated, after all – and so when mistakes and failures naturally crop up, it can be difficult to find and eradicate them.

Perhaps the biggest benefit, however, is simply that the 2.0 version of software would be a lot easier day-to-day, assuming, of course, that everything goes to plan. If an algorithm is difficult to create by hand, it makes sense to create it in Software 2.0: it’s a more intuitive way to code.

It’s unclear just yet whether Software 2.0 will change the planet or just add another dimension to development. What’s certain, though, is that this new way of writing code will challenge existing structures.

Luke Conrad

Technology & Marketing Enthusiast
