How Programming is Set to Change in the 2020s

As we enter a new decade, the achievements of the previous one suddenly stand out much more clearly. Big data and artificial intelligence have become central to modern technology, for example, and developments in the wider programming world have had worldwide impact.

Just think back for a moment. At the beginning of the 2010s, we didn’t even have iPads. The Raspberry Pi didn’t exist. Windows 7 was Microsoft’s newest operating system.

And now? 

iPads are ubiquitous, the Raspberry Pi has been left in the dust by other, more capable entry-level coding platforms, and Microsoft is investing heavily in artificial intelligence. Oh, and Google has just claimed to have achieved quantum supremacy.

Given how much has happened in the last 10 years, it’s intriguing to think about what might happen in the next. Here are our five predictions for how programming and computing will change in the decade to come.


Greater focus on AR and VR 

Augmented reality (AR) and virtual reality (VR) have already had a big impact in recent years, most prominently in the gaming world. The Oculus Rift and other similar devices managed to carve out a brand new corner of the market for themselves, and games with an AR element – take Pokémon Go, for example – proved to be wildly popular.

Despite the technology having been around for more than a decade, though, it is expected to become far more relevant in the near future. In fact, by 2025, the AR and VR market is expected to attain a value of over $25 billion.

Of course, this isn’t all from gaming. On the whole, AR and VR technology will probably have a more noticeable impact on the healthcare industry, with ‘telemedicine’ (the concept of being able to see a doctor without actually leaving your home) and ‘telehealth’ (the reverse: being able to monitor a patient from a remote location) already on the horizon. AR and VR are also expected to improve surgeries, both through programs used for training purposes and tools that will help physicians in real time.

Furthermore, according to HackerNoon, “Virtual reality in the education space is predicted to be a $200 million industry by 2020 and 97% of students today would like to opt for a VR course.” 

They continue: “According to reports, the education sector is at the 4th place for AR and VR investments. Around 80% of teachers prefer to have access to virtual reality devices, and only 6.87% use them in the e-learning process.”

Needless to say, then, we will see greater use of languages that facilitate AR and VR technology. According to TechRepublic, the top languages for AR and VR development at the moment are:

  1. C#
  2. C/C++
  3. Java
  4. JavaScript
  5. Python
  6. Visual development tools
  7. Swift

Who’s to say that this won’t change in the coming years, however?


AI to increase in importance

AI’s influence on the computing world has already been staggering, but the technology is still far from maturity. As such, the onus on coding languages to carry the weight of AI development is immense.

As we’ve touched on before, Python is arguably the best choice for machine learning, with many experts singling it out as being superior to its close competitors, Java and C++.

“Python is known to be an intuitive language that’s used across multiple domains in computer science,” a 2018 Cloud Academy report stated. “It’s easy to work with, and the data science community has put the work in to create the plumbing it needs to solve complex computational problems. It could also be that more companies are moving data projects and products into production.”
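
To give a sense of what that “plumbing” looks like in practice, here is a minimal sketch of a typical machine learning workflow in Python, using the popular scikit-learn library. The dataset and model are our own illustrative choices, not a recommendation:

    # A minimal sketch of a machine learning workflow in Python.
    # The dataset and model here are illustrative choices only.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    # Load a small built-in dataset of flower measurements
    X, y = load_iris(return_X_y=True)

    # Hold back a quarter of the data to test the model on
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=42
    )

    # Train a simple classifier and check how often it is right
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)
    print(f"Accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")

The point is less the model itself than the brevity: the ecosystem’s libraries do most of the heavy lifting.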

Just as engineers are programming AI technology, however, AI technology is beginning to take some of the strain off engineers. Automation is creeping into every industry – and software engineering is no exception.

This is partly a good thing, as automating basic functions allows programmers to focus on more complex tasks. However, in the long run, it could spell job shortages for tech professionals.

“I can envision systems that become better and better at writing software,” said Bart Selman, a computer scientist at Cornell University. “A person complemented with an intelligent system can write maybe ten times as much code, maybe a hundred times as much code. The problem then becomes you need a hundred times fewer human programmers.”

This almost certainly won’t happen within the next decade: the technology is nowhere near advanced enough. But it will happen eventually, and software engineers may well be the ones programming themselves out of a job.

Of course, one could argue that AI will never surpass human intelligence, and therefore won’t be able to do a better job of programming than a real person would. But that, unfortunately, isn’t true. As Selman goes on to explain, chess programs don’t have to be written by grandmasters. If you know the rules, and can teach a computer the rules, the chances are that it will outmatch you with ease.


Data is essential

Data is at the root of programming today, and it will remain so throughout the next decade. And, as big data goes hand in hand with AI, there is again some discussion around which language should be utilised for the task.

Unsurprisingly, Python is often hailed as the champion.

As i-Programmer explains, “Python is the most popular language used by data scientists to explore Big Data, thanks to its slew of useful tools and libraries, such as pandas and matplotlib. Python also has excellent performance and scalability for data science tasks, and it can be used with fast Big Data engines such as Apache Spark via the available Python API.”
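
As a rough sketch of what that looks like in practice, here is how a data scientist might hand the heavy lifting to Spark via its Python API, PySpark, and pull a small summary back into pandas. The file path and column name below are hypothetical:

    # A rough sketch of exploring big data with PySpark.
    # The file path and column name are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("big-data-sketch").getOrCreate()

    # Spark reads and aggregates the data in a distributed fashion
    df = spark.read.csv("events.csv", header=True, inferSchema=True)
    summary = df.groupBy("country").count().orderBy("count", ascending=False)

    # Pull a manageable slice back into pandas for local exploration
    top_countries = summary.limit(10).toPandas()
    print(top_countries)

Spark keeps the full dataset distributed across the cluster; only the small summary ever touches a single machine.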

Software engineers who work with big data will become more valuable, and so those who are just starting out in their career ought to consider this an option.


Coding will become a standard school subject

The jury is still out on this one, with some experts claiming that coding ability is on the cusp of becoming as essential as literacy or numeracy, and others arguing that it is about as necessary as knowing how to fix a car engine: useful, but there will always be someone else to do it if you, personally, don’t know how.

With that being said, there has been a significant increase in the number of children who are learning at least basic programming skills.

This is partly due to organisations such as Code.org, “a nonprofit dedicated to expanding access to computer science in schools and increasing participation by women and underrepresented minorities.” On their website, they say their vision “is that every student in every school has the opportunity to learn computer science, just like biology, chemistry or algebra,” and their impact has already been significant.

Even amongst proponents of teaching children some beginners’ programming skills, however, there is debate over how exactly the subject should be taught. Annette Vee, an English professor who has written a book about programming, argues that it should be taught more as a language than as a computer science skill.

“Programming is too important to be left just to computer science departments,” she says. 

And, to counter those who argue that it is an unnecessary skill that not everyone will need to use in a profession, she says: “If we assume that those who learn to write need to be English majors, we would be in trouble.”


A universal language could emerge

Compared to the other points on this list, this – to some, at least – may actually seem to be the least believable. 

Sure, Python may have cropped up a few times where AI and big data are concerned, but it is not the be-all and end-all of coding languages. There are hundreds of options out there, each with features that make them more or less suited to certain tasks, so no single one of them will ever take over and become universal.

However, we should not think of the languages we have now as ones that have to battle it out to become the only one, but rather as very primitive forms of languages that are yet to evolve into something more streamlined and multi-functional.

“Python may be remembered as being the great-great-great grandmother of languages of the future, which underneath the hood may look like the English language, but are far easier to use,” says Karen Panetta, an IEEE fellow and dean of graduate engineering at Tufts University. “Programs will be built using coding blocks, like the wooden alphabet blocks we used when we were children. Developers will be able to connect the blocks to implement whatever functionality they need, and the blocks may not even be required to be written in a textual form.”
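
It is anyone’s guess what such a block-based language will look like, but today’s Python can already hint at the idea: small, self-contained functions connected into a pipeline, much like snapping blocks together. The sketch below is entirely our own illustration:

    # A toy illustration of the 'coding blocks' idea: small,
    # self-contained functions connected into a pipeline.
    def clean(text):
        return text.strip().lower()

    def split_words(text):
        return text.split()

    def count(words):
        return len(words)

    def pipeline(*blocks):
        """Connect blocks so each one's output feeds the next."""
        def run(value):
            for block in blocks:
                value = block(value)
            return value
        return run

    word_counter = pipeline(clean, split_words, count)
    print(word_counter("  The Quick Brown Fox  "))  # prints 4

Snap in a different block – say, a spell-checker – and the pipeline gains new functionality without touching the rest.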

Realistically, this won’t happen in the next decade – mostly because it’s just not practical. However, it’s on the list because we could very well see the dawn of this new language. It might need some time to properly take hold but, by 2030, it could already be in use somewhere. 


Looking ahead

Of course, none of this is certain. A whole new breed or hybrid of technology could emerge and steer the course of programming in a completely different direction to what we have been told to expect.

But what we can hope for, at least, is an emerging decade in which coding and software engineering are more highly valued as mainstream skills, and one in which those skills are put to world-changing use.

The technology we will see in the near future will no doubt have a huge impact – not just in the medical industry, as we briefly touched on with AR and VR technology, but also in transport, communication, connectivity, sustainability, and accessibility. The technological world we live in is gradually being reshaped, and programming is at the heart of it all.

For more articles like this, plus industry insights and job opportunities, make sure to sign up to the Geektastic Hiring community.
