Don't Bet Your Career on Quantum Computing

For years, overexcited investors have touted the revolutionary impact quantum computing will supposedly have. Step back and look at the field's current state of development, however, and it becomes obvious that the technology will not 'revolutionise' every aspect of our lives for decades to come.
Recently (11/11/22), Forbes published an article titled "How to Be Crypto-Agile Before Quantum Computing Upends the World". The expectation that QC will be the next industrial revolution has already spread across the top of the financial world. This sentiment is misplaced, and it will prove disastrous to actual QC research when investors realise they may not reap rewards in their lifetimes.
Take the example of IBM and Google, which both use 'superconducting qubits' in their quantum computers. These currently need to be cooled to 10-20 mK, a temperature so close to absolute zero that it has not been found to occur naturally in space. IBM's largest machine, launched recently (9/11/22), is a 433-qubit system (qubits being the quantum analogue of bits, and thus a rough marker of a QC's prowess); Google's sits at 53. Current estimates suggest that, to become commercially viable, a quantum computer may require 100,000 to 1 million qubits. The big companies' current approach is to cool these ever-growing systems to incredibly low (and expensive) temperatures in record-breaking 'super-refrigerators'. The challenge of simply scaling these quantum computers to size may prove a far more long-term hurdle than current news coverage would suggest.
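To put the scaling gap in perspective, a quick back-of-the-envelope sketch: counting how many doublings separate today's 433-qubit system from the 100,000-to-1-million-qubit viability estimates above. (The figures come from the article; any pace of doubling, e.g. one per year, is purely a hypothetical for illustration, not a published roadmap.)

```python
import math

# Figures cited above: IBM's largest system is 433 qubits; commercial
# viability estimates range from 100,000 to 1,000,000 qubits.
current = 433
targets = [100_000, 1_000_000]

for target in targets:
    # Number of doublings of qubit count needed to reach the target.
    doublings = math.log2(target / current)
    print(f"{target:>9,} qubits: ~{doublings:.1f} doublings needed")
```

Even under a (hypothetical) aggressive doubling-every-year pace, that is roughly 8 to 11 more years of sustained hardware progress, before accounting for the cooling and error-rate challenges described above.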
To draw a parallel with another technology that underwent similar over-hyping and stagnation, look to artificial intelligence. In the 1970s there was an incredibly strong push to develop AI, and scientists globally spoke loudly of how it would revolutionise the world within just a few years. In 1970, Marvin Minsky stated, "In from three to eight years we will have a machine with the general intelligence of an average human being" - more than 50 years later, we still haven't quite made it. Much like QC, the development of AI was built on fundamentally good ideas. Provable advantages over older methods had already been demonstrated in AI by the 1950s, in much the same way 'quantum supremacy' has already been proven against classical computers in niche, non-applicable areas.
In the 70s, the barrier to AI was limited computing power. Today, QC must overcome the hurdles of hardware and of our limited understanding of what QCs will even be capable of. When quantum investors finally realise that they may not live to see their investments bring about real global change, there will be a 'Quantum Winter' in which funding dries up. Interest will wane, research will slow, and the development of these quantum computers will be delayed by another decade. An almost perfect parallel of the "AI winters" of the 70s and 80s will play out again if expectations for quantum computing are not lowered to realistic levels.
To summarise: quantum computing will change the world one day. However, you will never find it in your smartphone, and it will likely develop into a form no one expects today. The current misinformation about QC will probably come back to haunt its development and all of its overzealous proponents. Giving QC a wide berth, and applying some much-needed scepticism, would therefore be a wise choice in the current climate.