Is Artificial Intelligence a Thing of the Past?
BASIC, C, C++, COBOL, FORTRAN, Ada, Pascal.
For those of you who spot a foreign language in your midst, you are absolutely correct: the above are examples of computer programming languages learned and perfected by many a programmer around the world.
When you want to speak to a computer to get it to do what you want it to do, these are your go-to languages to learn. They are very real, albeit constructed, languages; there is vocabulary, there are grammar rules, and there is even ‘shorthand’ or ‘colloquial’ language for those in the know.
But today, with deep learning, natural language processing, and everyone from Google to Facebook not only reading our words but predicting what we are going to say next, are the coveted programming languages of the select few in danger of becoming obsolete, or are they obsolete already?
Those already in the know…
We spoke last year about computer programming being learnt as a second language, with children in some schools learning the language of Python instead of French. And for those of us who dabble with sites such as WordPress and Tumblr, there are a fair few of us out there who already speak a little HTML. True: knowing the tags for simple functions like italics, bold and line spacing is perhaps not enough to say we know the language, but when bloggers are tapping away at their laptops and adding these small sections of code automatically, rather than having to look them up, it seems they have been absorbed into (some people’s) everyday vocabulary.
A history lesson…
Not so much a history lesson, rather an observation of the dwindling strictness of programming languages. Gone are the times when vast amounts of coding needed to be done in order to have the simplest of functions happen. Of late, although every line of code still needs to be written by a human, that code is becoming more and more efficient, taking up less time and space, and streamlining the process for us all.
There are parallels here with telegraphy back in the nineteenth century. In those days every dot and dash of Morse code was translated into human-talk, typed by human hands so that human eyes could see and understand it. We no longer send telegraphs, and Morse code nowadays is the tool of amateur radio enthusiasts, of the US Navy and Coastguard when using signalling lamps, and of some corners of the aviation and aeronautical fields. But telegraphists were once highly sought after, an elite of the general workforce. Is the same fate awaiting our generations of software developers and engineers?
You can talk to your Android device to find out who won the first Nobel Prize in Physics (Wilhelm Conrad Röntgen, 1901, if you are pub quizzing) and Siri will tell you when the next tram is due. It seems that even when we run out of actual people to speak to, our devices will always be available to talk back to us, and give us all the information that we want. Apple is even sending Siri back to school to make her smarter and easier to communicate with, by taking on Carnegie Mellon University professor Ruslan Salakhutdinov, an expert in deep learning and neural networks whose work will potentially help Siri to think more like us.
Despite our growing list of demands on Google and Siri, technology is not yet at a point where we can train it, as we would a dog, to perform the things we would expect of a fellow human. This is where our programmers come in, and it is why we think their jobs will always have relevance. As the capacity for human growth constantly shifts, so do our wants and needs: Maslow's hierarchy of needs still makes sense for us today, yet it doesn’t explicitly mention things like WiFi and ergonomic office equipment. And yet those things still fall somewhere within that spectrum and are very much the essentials of our day-to-day lives.
Perhaps what is fairer to predict for the future of programming is a shift in skills. Becoming a computer programmer still requires vast amounts of study, yet just about anyone can watch a YouTube video and programme the most basic of things for themselves without having to rely on an external source. It is fair to say that the gap between what we want our technology to do for us and making it understand those wants is closing, and that the developers/Gandalfs guarding the edge of that precipice no longer stand the same chance of stopping us/earning money from us, when it comes to getting what we want. But at the end of the day, we are all mere Red Bull Flugtag pilots when it comes to the most complex of technology, and doomed to stumble without professional guidance when we need to do something difficult.
The way of things…
Programmers, then, your livelihoods are safe; we still need you, even if not in all the ways we used to. As technology and artificial intelligence develop, there are some things we simple software users can and will learn to do for ourselves, yet it seems there will always be a need for a translator when we want our computers to do something more complex than just respond to our basic commands.
A final thought. Should the day come when we can simply speak at our computers and have them do precisely what we want of them, what language will we be speaking in? English, the lingua franca that ‘everyone’ communicates in? Esperanto, the proposed international language that has no home and therefore technically no biases? Or Japanese, the language of one of the most rapidly advancing technological countries in the world?