For the longest time, many in the technology community were fixated on processor speed and the number of transistors on a chip, along with the ever-declining cost of delivering that capability. This has held up as a critical yardstick for measuring our progress and technological prowess. If this were the only metric, we have been wildly successful in that endeavor, but at what cost?
In my last blog post, I suggested that business leaders, along with their respective boards, begin planning now for the future (post-COVID-19). I argued then that this was not about dusting off a playbook for how to “restart” a facility or a business, but rather about seeing the impact COVID-19 is having on all their stakeholders. This assessment needs to be made in the broadest of terms: each individual’s perception of the changes brought on by the virus and how they wish to interact with others, whether as a consumer, an employee, a member of a community and so on. Social norms are being upended; the attributes we once thought of as good or admirable are shifting, and that is the foundation on which each of us has to re-imagine the future.

I believe the rate of societal change driven by technology disruption and adoption has been accelerated by perhaps three years or more because of the virus. Linked to this technology migration is a new awareness of the fragility of every business’s and government’s supply chain. Being caught flat-footed when agility and speed are required will not be excusable in the future. Assuming each organization has a multi-year business plan, I recommend ignoring what was written down for 2021 and 2022 and starting with 2023. It is fair to assume that, for most leaders, those future plans relied heavily on investing in digital technologies rather than major capital expenditures. Quickly returning to profitability is top of mind for leaders, but doing so will come with its own costs.
So why is it time to consider a new Moore’s Law? I am not a social scientist, but my observation is that because we came to believe in Moore’s precepts, we were able to achieve the stated outcomes. (Perhaps the Hawthorne Effect on full display?) If, in fact, we believe as business leaders that we are not turning the clock back to November 2019, then what becomes of all the workers who had a job back then? As I write this, more than 17 million Americans have filed for unemployment assistance. Workers whose employment was already on the path to disruption may not have enough runway to achieve a different outcome. When do we start to measure how many among us want to work but cannot find a job because the work is no longer required or has been displaced entirely by technology? When do we start to focus on, and reward, inventions and innovations that provide meaningful employment for workers whose jobs have already vanished or soon will? What metrics would we want to be held accountable for delivering against, whether we call it a law, a duty or just good business? Making Moore’s Law a reality took a tremendous amount of effort and innovation. Let’s make the future. What do you think?
Tim McCabe is a CGS Fellow helping to lead clients through significant transformation. He is a former global CIO and officer of Delphi Corporation, where he helped lead the company through the restructuring and digitization stemming from the largest bankruptcy in history. He can be reached at Tim.McCabe@CGSAdvisors.com.