Heat of (tomorrow's) moment

What will the job of a computer engineer look like in 20 years?

Call To Heat — Jul 2021 by Luís Alberto Almeida

That's a difficult question. Technology is changing so fast today that in 20 years, an entirely new world will stand before us. According to some widely cited forecasts, many of the children in school today will end up working in jobs that don't exist yet.

Many of the Engineering jobs that will be common in 20 years have yet to be invented, and they will be heavily shaped by the revolution that lies ahead.

One might think that if we look back into the past, we can glimpse the future. Perhaps, but if we want to really understand how things may change in the next twenty years, we need to look way back, maybe 100 years, and see the waves of disruption that happened during that time. Eons of evolution are being squeezed into a "short" period, and everything is moving like a gigantic, unstoppable wave. This wave is fed by daily discoveries, vast amounts of data and information, knowledge, people's creativity, and so on.

So, how can one realistically preview what the future holds for a computer engineer? Well, we can make educated guesses based on current trends and emerging technologies. This article looks into just a few of those technologies and offers my view on how things will turn out. There are many other exciting topics I'd like to discuss, like NoOps, no-code, virtual/augmented reality, edge computing, 6G, and so on, but I'll leave them for another time. So, buckle up.

 

1.    Artificial Intelligence and Machine Learning

Although Artificial Intelligence and Machine Learning were born much earlier, I first became keenly aware of them back in 1996, during the epic chess battle between Garry Kasparov and IBM's Deep Blue. Kasparov won the 1996 match, so the team behind Deep Blue went back and worked relentlessly to improve the machine: more processing power, a larger knowledge base, a bigger database of past games, and much more. In 1997, Deep Blue and Kasparov replayed their six-game match. The chess master won the first game, Deep Blue won the second, the following three ended in draws, and Deep Blue won the final one. What a match! Even today, many people ask how the machine managed to win against one of the greatest chess masters alive. There are several texts outlining how it was done, but my question is the total opposite: how did Garry Kasparov ever beat Deep Blue, a computer capable of evaluating 200 million chess positions per second, with access to millions of games already played and an understanding of advanced playing techniques?

This "men vs. machine" event was twenty-four years ago when Machine Learning was rapidly evolving and pushing the limits of Artificial Intelligence. Machine Learning is a form of data analysis, where systems learn from data, identify patterns, and try to make informed decisions with little or no human intervention. Today we're in the era of Deep Learning, where computers use advanced algorithms to create neural networks that can learn from data and make intelligent decisions independently. Computers are doing remarkable things with the sheer amount of data at their disposal. For instance, GPT-3 from OpenAI can write software applications based on natural language: you describe what you need, and the program generates the code all by itself. It's awe-inspiring.

AI/ML jobs are premium roles today, but they will be everywhere in twenty years. Engineers will need to understand and work with AI tools on a daily basis. There'll be highly advanced, optimized AI chipsets running inside your PC, car, mobile, TV, microwave, or any IoT device. AI will be omnipresent and, in some cases, will feel omniscient.

Some say that we're engineering ourselves out of existence. I don't believe this, although we do need to exercise caution. Sophia, the social humanoid robot from Hanson Robotics, gave a canned but telling response when asked how humans and robots will work together: "Robots can free humans from the most repetitive and dangerous tasks, so they can use their time in what they are best, being creative and solving complex problems. Robot intelligence does not compete with human intelligence; it completes it."

That's right. Creativity and imagination will be THE most valued assets of Engineering roles in the future. You simply cannot automate them. That's why, back in 1996, Kasparov still managed to beat Deep Blue against all odds.

 

2.    Quantum Computing

In 1965, an engineer named Gordon Moore noticed an interesting trend: the number of transistors in an integrated circuit was doubling roughly every two years. In layman's terms, this meant that the processing power of computers doubled every couple of years while the production cost per transistor was cut in half over the same period. Decades have gone by, and Moore's Law still roughly holds today, albeit steadily slowing down. There are, however, physical limits to the number of transistors that can fit into an integrated circuit. So, the question is, what'll happen next? Will computers stop advancing at a certain point? Certainly not. Present computer architectures will evolve and explore other directions, and new ones will appear or mature. Which ones? Let's talk about quantum computing.
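
To put that doubling in perspective, here's a quick back-of-the-envelope calculation (assuming an idealized, uninterrupted two-year doubling period, which reality only approximates):

```python
# Back-of-the-envelope Moore's Law: one doubling every two years.
for years in (10, 20, 40):
    doublings = years / 2
    print(f"after {years} years: ~{2 ** doublings:,.0f}x the transistors")
```

Twenty years of doublings is roughly a thousandfold increase, which is why even a "slowing" Moore's Law reshaped the industry so completely.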

Information in quantum computers is stored in qubits instead of bits. Like ordinary bits, qubits can represent a zero or a one, but through a quantum-mechanical property called superposition they can also be a blend of both at once: loosely speaking, 35 percent zero and 65 percent one, for instance. The payoff is that, while n classical bits hold exactly one of their possible values at a time, n qubits can exist in a superposition spanning all 2^n combinations at once. Furthermore, there's also a law for quantum computing, Neven's Law, which states that quantum computing power is achieving "doubly exponential growth relative to classical computing". That's striking: growth that is merely exponential in silicon-based computers becomes doubly exponential in the quantum world.
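
To make those numbers tangible, here's a tiny NumPy sketch of my own that simulates (rather than runs) a single qubit, encodes the 35/65 example, and shows why the 2^n state space quickly overwhelms classical machines:

```python
# Simulating (not running) one qubit in superposition with NumPy.
import numpy as np

# Measurement probabilities are the squared amplitudes, so sqrt(0.35)
# and sqrt(0.65) encode the "35 percent zero, 65 percent one" example.
state = np.array([np.sqrt(0.35), np.sqrt(0.65)])
probs = np.abs(state) ** 2          # -> [0.35, 0.65]

# A measurement collapses the qubit to 0 or 1 with those probabilities,
# which is the stochastic output discussed below.
samples = np.random.choice([0, 1], size=10_000, p=probs)
print("fraction measured as one:", samples.mean())

# The classical-simulation catch: n qubits need 2**n amplitudes.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes to track")
```

At 50 qubits you'd already need over a quadrillion amplitudes to simulate classically, which is exactly where quantum hardware starts to earn its keep.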

Unfortunately, quantum computing is so fundamentally different from conventional computing that it'll take some time to gauge the benefits. "The differences between quantum computers and classical computers are even more vast than those between classical computers and pen and paper", said Peter Chapman, CEO of quantum startup IonQ. While on a standard computer you take an input, apply an algorithm, and expect a well-defined output, a quantum computer gives you an estimate of how probable each answer is. So we go from a deterministic to a stochastic model, something the programming world is not very used to. However, this can be quite useful for problems with many possible inputs and complex scenarios to evaluate.

There is a plethora of areas that can take advantage of quantum computing, but I want to highlight just two:

  • Artificial intelligence - as we've already seen, AI is expanding rapidly thanks to the sheer amount of available data, evolving algorithms, and growing processing power. Unfortunately, that processing power is still not enough for what engineers and scientists want to achieve. Jump to the quantum world, and the possibilities seem unlimited.
  • Cryptography - today's security protocols rely on hard mathematical problems to protect keys and data. These problems are a prime target for quantum computers, which could solve them fast enough to compromise security (classical brute force, by contrast, hits a wall quickly, as the toy sketch after this list shows). Researchers are already working on quantum-safe cryptography, but there's a steep road ahead. Imagine the Y2K bug on steroids: instead of fixing a date issue, you need to replace an entire set of security algorithms in a billion programs around the world. In the end, however, we'll have a much more secure infrastructure to support the years ahead.
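
Here's the toy sketch promised above: a classical brute-force key search, showing how every added key bit doubles the work (purely illustrative; real attacks, quantum ones included, exploit the mathematical structure of the scheme rather than enumerating keys like this):

```python
# Toy brute-force key search: every extra bit doubles the work.
import secrets
import time

for bits in (16, 20, 24):
    secret = secrets.randbelow(2 ** bits)   # the "key" we're attacking
    start = time.perf_counter()
    for guess in range(2 ** bits):          # enumerate every candidate
        if guess == secret:
            break
    elapsed = time.perf_counter() - start
    print(f"{bits}-bit key: {2 ** bits:,} candidates, {elapsed:.3f}s")
```

Extrapolate that doubling to a 128-bit key and classical enumeration becomes hopeless; the quantum threat is that algorithms like Shor's sidestep the enumeration entirely for today's public-key schemes.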

All things considered, quantum computers won't be a commodity anytime soon. Most current designs must be kept at temperatures close to absolute zero, which means it's very unlikely that you and I will have one under the desk in the foreseeable future. Still, it's a perfect opportunity for cloud providers like Microsoft, IBM, Amazon, and Google. They already provide tools to explore quantum computing, so it's not a question of "if", but "when".

 

3.    Blockchain

Satoshi Nakamoto introduced the idea behind blockchain in 2008, in a paper describing a peer-to-peer electronic cash system. The following year, it saw its first implementation as the technology underlying the cryptocurrency Bitcoin. In simple terms, a blockchain is an immutable, decentralized, and distributed digital ledger containing records of all transactions across the network. Transactions can involve almost anything: currency, contracts, documents, and so on. The term blockchain comes from the fact that transactions are stored in blocks that are chained, or linked, together: each block carries a cryptographic hash of the previous one, and transactions are digitally signed, so records cannot be tampered with once saved in the chain. Also, since anyone can contribute and support the blockchain by running a node, it's fully distributed and has no central authority. The magic lies in how transactions are validated in this fully distributed world: through consensus, a mechanism by which nodes agree on the state of the network at a given moment. There are several consensus mechanisms, such as Proof of Work (PoW), Proof of Stake (PoS), or Proof of Capacity (PoC), each with its pros and cons.
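
To ground the "blocks chained together" idea, here's a minimal hash-linked chain in Python. It's a teaching toy of my own, not Bitcoin: there's no network, no consensus, and no signatures, just the core property that each block commits to its predecessor, so altering old data breaks every later link:

```python
# A minimal hash-linked chain: a teaching toy, not a real blockchain
# (no network, no consensus mechanism, no digital signatures).
import hashlib
import json
import time

def make_block(data: str, prev_hash: str) -> dict:
    """Build a block that commits to its predecessor via prev_hash."""
    block = {"time": time.time(), "data": data, "prev_hash": prev_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()
    return block

# Each new block stores the hash of the one before it.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))

def verify(chain: list) -> bool:
    """Recompute every hash: tampering with any past block shows up."""
    for prev, block in zip(chain, chain[1:]):
        body = {k: v for k, v in block.items() if k != "hash"}
        serialized = json.dumps(body, sort_keys=True).encode()
        if block["prev_hash"] != prev["hash"]:
            return False
        if hashlib.sha256(serialized).hexdigest() != block["hash"]:
            return False
    return True

print(verify(chain))                        # True
chain[1]["data"] = "Alice pays Bob 500"     # rewrite history...
print(verify(chain))                        # False: tampering detected
```

What a real blockchain adds on top of this is precisely the consensus mechanism described above, so that thousands of independent nodes agree on which chain of blocks is the valid one.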

Although it has been around for more than ten years, blockchain has not yet seen mainstream adoption. Both consumers and enterprises are still in the early-adoption phase, experimenting with the technology and building solutions around it. There are several reasons for caution, as blockchain comes with problems of its own, like scalability, privacy, and regulation, but they are being worked on as we speak. The blockchain ecosystem is maturing every day around the core principles it was born with. And although cryptocurrencies are the pervasive use case for blockchain today, other use cases are plentiful.

The question is, where will blockchain take us? Will it ever reach mass adoption? Will it be ubiquitous by 2040? I believe so. From the moment we start seeing a growing number of solutions incorporating blockchain in industries like banking, cybersecurity, supply chain, healthcare, and government, trust will grow among consumers and other industries. The barrier to entry in these areas is quite significant due to heavy regulation, but at the same time, they are the ones that can benefit the most. We need to move past today's paradigms and dogmas to achieve breakthrough results.

This means that blockchain jobs will be mainstream, since most solutions will benefit from the technology one way or another. Blockchain will be as standard a building block as a database is today.

 

4.    Cloud-based Services

Cloud computing is everywhere. Most of the services we use today are supported by the cloud in one way or another, through one of its XaaS models (Software, Infrastructure, Platform, Communication, Network, etc.). During the COVID-19 pandemic, millions of people had to work from home, a massive stress test for remote-working platforms like Zoom, Teams, or WebEx. Although it was a test they could easily have failed, most users saw no impact on their daily use; I certainly didn't. That's because the platforms silently scaled up their infrastructure to meet the growing needs of their customers. It was an impressive "exercise" in planning, engineering, and operations.
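
That silent upscaling is, at its core, a control loop: measure utilization, compare it against a target, and adjust capacity. Below is a deliberately naive sketch of such a loop; the target value, bounds, and simulated metrics function are all invented for illustration, and real autoscalers in AWS or Kubernetes are far more sophisticated:

```python
# A deliberately naive autoscaling loop. The target, bounds, and the
# simulated metrics function are invented purely for illustration.
import random

TARGET_UTIL = 0.60                  # aim for 60% average utilization
MIN_INSTANCES, MAX_INSTANCES = 2, 100

def average_utilization(instances: int) -> float:
    """Stand-in for a real metrics API; returns simulated load."""
    demand = random.uniform(5, 40)  # load, in "instance-equivalents"
    return min(demand / instances, 1.0)

instances = MIN_INSTANCES
for tick in range(10):
    util = average_utilization(instances)
    # Target tracking: scale in proportion to observed vs. desired
    # utilization, the same basic rule used by cloud autoscalers.
    desired = round(instances * util / TARGET_UTIL)
    instances = max(MIN_INSTANCES, min(MAX_INSTANCES, desired))
    print(f"tick {tick}: utilization {util:.0%} -> {instances} instances")
```

The reason users saw no outages is that loops like this run continuously, adding capacity before humans even notice the load.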

In the future, applications built on top of cloud-based services will be the de facto standard. Terms like "cloud-ready" and "cloud-native" will be gone, as there'll be no other way. Infrastructure jobs will steadily decrease as companies run most of their services in the cloud. Of course, cloud providers will still need infrastructure engineers, but even they will be heavily supported by a high degree of automation and AI tooling. We've seen the pattern time and time again: companies start by investing vast amounts of money in their own infrastructure and staff, then ride the cloud wave years later. If this is happening right now, imagine how things will play out in 2040.

There are several excellent examples of this, but I'll highlight just one. Netflix has more than 200 million members in more than 190 countries, who watch 125 million hours of TV shows and movies each day. Netflix uses AWS for most of its computing and storage needs, including databases, analytics, recommendation engines, video transcoding, and more, running on more than 100,000 server instances. With this, Netflix can lower prices, ship better products, foster innovation, and improve time to market; in short, it can focus on providing a better product and experience to its customers.

Cloud computing has always been at the forefront of the digital revolution, supporting and advancing emerging technologies thanks to the vast resources at its disposal. Artificial intelligence, quantum computing, and others are already offered as a service on several platforms, giving people the means to explore these technologies and build a new set of solutions around them.

So, this is it: a forecast of the future. Maybe it'll be like this, or maybe not. But does it matter? The train is moving, and it'll get there whether we're aboard or not, so we might as well catch the ride. One thing I know for sure: it'll be one hell of a journey. And whatever the future holds, it'll be a wonderful new world to live in.

 

References:

http://hasler.ece.gatech.edu/Published_papers/Technology_overview/gordon_moore_1965_article.pdf

https://builtin.com/software-engineering-perspectives/quantum-classical-computing

https://www.quantamagazine.org/does-nevens-law-describe-quantum-computings-rise-20190618/

https://research.aimultiple.com/quantum-computing-cloud/

https://www.businessinsider.com/quantum-computing-investing-computers-enterprise-2021-3

https://www.simplilearn.com/tutorials/blockchain-tutorial/blockchain-industries

https://101blockchains.com/blockchain-adoption-challenges/

https://www.planetcompliance.com/5-reasons-for-the-speedy-adoption-of-blockchain-technology/

https://www.geeksforgeeks.org/consensus-algorithms-in-blockchain/

https://www.livewebinar.com/blog/webinar-marketing/50-video-conferencing-statistics-for-the-year-2020