
Future-casting: How will machine intelligence develop?

(Image credit: Enzozo / Shutterstock)

Most people are familiar with the ‘intelligent assistant’ Siri on Apple iOS devices and Microsoft’s Cortana, with bank fraud detection, and with self-driving cars. Yet machine learning is often lumped together with artificial intelligence (AI), and most people confuse the two technologies. So what is the difference? Machine learning is a subset of AI: rather than following explicitly programmed rules, the machine learns behaviour from data. With a machine learning system, for example, we can teach a machine how to win a poker match. Another application is one people unwittingly use every single day of the year – including when they’ve used their credit cards abroad. Oops, the card has stopped working! The reason is that the machine has been learning about your habits in order to assess whether an unusual transaction might present a risk of fraud.
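The habit-learning idea behind card fraud detection can be sketched in a few lines. This is a toy illustration only – real systems learn from many features (location, merchant, time of day), and every name, value and threshold below is invented for the example:

```python
# Toy illustration of habit-based fraud flagging: model a cardholder's
# typical spend, then flag transactions that deviate sharply from it.
from statistics import mean, stdev

def fit_profile(past_amounts):
    """Learn a simple spending profile from transaction history."""
    return mean(past_amounts), stdev(past_amounts)

def is_suspicious(amount, profile, threshold=3.0):
    """Flag amounts more than `threshold` standard deviations from the mean."""
    mu, sigma = profile
    return abs(amount - mu) > threshold * sigma

history = [42.0, 18.5, 55.0, 23.0, 37.5, 61.0, 29.0]   # typical card use
profile = fit_profile(history)

print(is_suspicious(45.0, profile))    # ordinary purchase -> False
print(is_suspicious(2500.0, profile))  # large unusual charge -> True
```

A real system would of course retrain continuously as habits change, which is exactly the “learning about your habits” the paragraph describes.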

Most of the projects that have used AI in the past have been big projects with very deep pockets. Furthermore, the industry has often hidden many AI projects away from the public due to the concern and uncertainty that widespread adoption might create over job security and its impact on society. Take a step back, though: this was the same fear that gripped many people in society with the introduction of the microprocessor.

Human relevancy

According to CNBC, Tesla’s CEO Elon Musk told an attentive audience at the World Government Summit in Dubai in February 2017 that humans must merge with machines or become irrelevant in the AI age. "Over time I think we will probably see a closer merger of biological intelligence and digital intelligence," Musk told delegates. The billionaire paints AI in a way that would scare even horror movie fans, and goes so far as to claim that a universal basic income will be needed once machines have taken over many people’s jobs.

But is this dark view of AI really justified? Certainly new technologies have taken people’s jobs in the past, and the UK government’s planned introduction of AI into many of its services is predicted to reduce the Civil Service head count by 250,000. However, when technology has taken over people’s jobs in the past, those jobs have often been replaced by new ones. In the accountancy industry, for example, it could arguably lead people away from data entry towards more analytical work that requires a symbiosis of human intelligence and AI. With this in mind, let’s also reflect on what happened way back in 1978 and on people’s thoughts about the microprocessor.

Microprocessor age

The BBC broadcast a Horizon documentary called ‘Now the Chips Are Down’. This explored not only the technology behind the microprocessor but also the effects that mass adoption would have on society. The predictions varied from mass unemployment in the manufacturing sector, with reductions in both skilled and unskilled workers, to the exact opposite: widespread wealth, with everyone enjoying large increases in leisure time.

Musk makes it sound as though everyone will be replaced by AI. Yet if we look back through history, all the way to the start of the industrial revolution, wherever new technology is introduced there is short-term displacement while the new technology creates other industries around it. Did the microprocessor lead to mass unemployment? Look at the industries it has created: personal computers, mobile phones, personal assistants, software companies, tablets. I’m not sure what happened to that extra leisure time it promised, though.

The introduction of the word processor certainly changed the office for every white-collar worker, especially the traditional copy typist. However, if we look at the long-term unemployment trends of Western societies, despite the initial doom and gloom the rate has remained pretty much the same, even though more people than ever make up the working population.

Did we get more leisure time? Whilst for many the working week has shortened, the average number of real hours worked by the Western working population is increasing once we include unpaid after-hours work and work done during holiday time.

Changing attitudes

The next generation has a very different view of technology from their elders. It seems less concerned with privacy and with how companies use its online presence to promote goods and services. Nor does the need for personal contact with other people reach the same level of expectation as with older generations. Personal contact used to be a very important part of communication, yet the next generation is happy to conduct it, and many other parts of life, electronically. I see this in my engineering staff in the way they research information on products.

In some ways the new generation prefers not to communicate verbally with other humans outside of their social group. In fact, most governments are actively pushing their citizens down this road. This opens up the possibility of introducing AI into services and products without the resistance normally associated with sweeping technology changes. But this doesn’t mean anyone’s job is safe: AI will change white-collar work patterns considerably – especially in government services.

Game changer

Without doubt, AI and its applications can be a massive game changer in many aspects of life and business. IBM is targeting medical and DNA analysis with its Watson program, and AI is increasingly being used to help spot breast cancer. To me this is very exciting: it shows that the world of AI needn’t be a scary place, as it can aid and support society to do good things.

What’s more, all the data that we have been holding on to for all these years has real value now as training material for AI. With it, big data analysis can really begin to sing. There are many other, smaller applications where it can add massive benefits – with all the sensors modern cars carry, AI should be able to spot or predict faults or component failures before they happen. Smart meters and IoT devices could help reduce our carbon emissions through more efficient balancing of demand with renewable energy, rather than leaving highly polluting power stations idling on standby just in case.
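The fault-prediction idea can be illustrated with a deliberately simple sketch: watch a sensor reading for sustained drift away from its baseline. All names, values and thresholds here are hypothetical, and a production system would use learned models rather than a fixed limit:

```python
# Hypothetical sketch: predicting component failure from a car sensor
# by watching for sustained drift in a reading (e.g. coolant temperature).
from collections import deque

def drift_alert(readings, window=5, limit=8.0):
    """Return True once the average of the most recent `window` readings
    drifts more than `limit` above the baseline (the first window's average)."""
    baseline = sum(readings[:window]) / window
    recent = deque(maxlen=window)
    for r in readings:
        recent.append(r)
        if len(recent) == window and sum(recent) / window - baseline > limit:
            return True
    return False

healthy = [90, 91, 89, 90, 92, 91, 90, 89, 91, 90]      # stable temperatures
failing = [90, 91, 89, 90, 92, 95, 99, 104, 108, 113]   # creeping upwards

print(drift_alert(healthy))  # False
print(drift_alert(failing))  # True
```

Averaging over a window rather than reacting to single samples keeps momentary spikes from triggering false alarms, which is the point of predicting a failure before it happens.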

Machine learning predictions

The BigML blog offers some predictions for 2017:

  • “Big Data” soul searching leads to the gates of #MachineLearning.
  • VCs investing in algorithm-based start-ups are in for a surprise.
  • #MachineLearning talent arbitrage will continue at full speed.
  • Top down #MachineLearning initiatives built on PowerPoint slides will end with a whimper.
  • #DeepLearning commercial success stories will be few and far between.
  • Exploration of reasoning and planning under uncertainty will pave the way to new #MachineLearning heights.
  • Humans will still be central to decision making despite further #MachineLearning adoption.
  • Agile #MachineLearning will quietly take hold beneath the cacophony of AI marketing speak.
  • MLaaS platforms will emerge as the “AI-backbone” for enterprise #MachineLearning adoption by legacy companies.
  • Data Scientists or not, more Developers will introduce #MachineLearning into their companies.

IT industry gains

Even within our own IT industry there are gains to be made from the use of AI. Data storage and the movement of data is one. The way in which we control data still harks back to the start of the computer industry. In many cases it is still held in silos defined by humans and governed by servers and their policies – again defined by someone. Isn’t it about time we let data look after itself and find the right place to be stored? Isn’t it also time we let data decide where it should reside in the storage hierarchy, and define its own backup policy based on usage and protection needs? It seems that time has come, because even WAN optimisation solution providers are introducing machine learning this year.
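As a rough illustration of what “data looking after itself” might mean, here is a minimal, hypothetical tiering rule driven by usage signals. The tier names and thresholds are invented for the example; a real machine-learning system would learn placement from access history rather than hard-coding rules:

```python
# Illustrative usage-driven tier placement across a storage hierarchy.
from datetime import datetime, timedelta

def choose_tier(accesses_last_30d, last_access, now):
    """Pick a storage tier from two simple usage signals:
    recent access count and time since last access."""
    age = now - last_access
    if accesses_last_30d >= 100 or age < timedelta(days=1):
        return "hot-ssd"        # frequently or very recently used
    if accesses_last_30d >= 5:
        return "warm-disk"      # occasional use
    return "cold-archive"       # dormant data

now = datetime(2017, 6, 1)
print(choose_tier(250, now - timedelta(hours=2), now))   # hot-ssd
print(choose_tier(12, now - timedelta(days=10), now))    # warm-disk
print(choose_tier(0, now - timedelta(days=400), now))    # cold-archive
```

The same signals could drive backup policy too – heavily used data gets frequent protection, dormant data a slower cycle.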

Yet data acceleration solutions such as PORTrockIT have been using machine learning for a while now, which raises the question of what adding machine learning to WAN optimisation will actually change. The movement of data, be it for back-up or migration, is becoming more difficult even over new high-bandwidth WANs. This seems counter-intuitive, and it calls for a new approach that goes beyond traditional WAN optimisation: letting machine learning automate tasks that would otherwise require manual intervention, with the increased risk of human error that brings. This frees IT staff to focus on the other tasks they have to undertake each day.

WAN optimisation

Traditionally we have used WAN optimisation to manage and optimise the flow of traffic across the WAN. But with the changing formats we now transmit, such as rich multimedia and compressed, encrypted or deduplicated data, the traditional WAN optimisation technique of deduplication can no longer cope, and much of the traffic is passed straight through without being optimised.
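To see why deduplication struggles with this traffic, consider a toy chunk-level deduplicator: repetitive payloads collapse dramatically because identical chunks recur, while encrypted data (which is statistically indistinguishable from random bytes, simulated here with `os.urandom`) barely dedupes at all:

```python
# Toy chunk-level deduplication: split data into fixed-size chunks,
# hash each one, and count how many chunks are duplicates of earlier ones.
import hashlib
import os

def dedup_ratio(data, chunk=1024):
    """Fraction of chunks eliminated by storing each unique chunk once."""
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    unique = {hashlib.sha256(c).digest() for c in chunks}
    return 1 - len(unique) / len(chunks)

repetitive = b"same log line, repeated\n" * 10_000   # typical text traffic
random_like = os.urandom(len(repetitive))            # stands in for encrypted data

print(f"{dedup_ratio(repetitive):.0%}")    # very high: most chunks repeat
print(f"{dedup_ratio(random_like):.0%}")   # essentially zero: nothing repeats
```

Good encryption and compression deliberately remove the redundancy that deduplication depends on, which is exactly why such traffic passes through unoptimised.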

This results in poor utilisation and a reduced return on investment in new high-speed WAN capabilities. Beyond this, great possibilities will emerge from machine learning and AI that will benefit mankind in managing IT networks and health, in optimising resources, and in many other areas. I therefore think the future of machine learning and AI is brighter than the one presented by Musk.

David Trossell, CEO and CTO, Bridgeworks

David Trossell
David Trossell is CEO and CTO of award-winning data acceleration company Bridgeworks, which has developed products such as PORTrockIT. Bridgeworks won the Data Centre ICT Networking Product of the Year category at the DCS Awards 2018, having also won the same category at the DCS Awards 2017.